Main
The accelerating deployment of artificial intelligence (AI) servers is being fuelled by the growing demand for generative AI applications, ignited by milestones such as the release of ChatGPT in 2022 (ref. 1). Projections signal even greater impact, exemplified by the recent Blackwell platform, which analysts herald as a new Moore’s Law era2. Pronouncements from influential AI industry figures3 on both the demand and supply sides are seen as transformative shifts for the entire data-centre industry. Whereas AI has been employed in various fields to advance sustainability4,5, the remarkable energy requirements of AI itself raise concerns regarding not only energy provisioning challenges6 but also water scarcity and climate change issues stemming from the energy–water–climate nexus of AI data centres1,7,8. However, the holistic energy–water–climate implications of AI computing remain largely unknown, constrained by opaque industry reporting and limited data.
The climate impact of AI servers will stem primarily from their operations (Scopes 1 and 2)9 and supply-chain activities (Scope 3), including manufacturing and end-of-life treatment10,11. Scope 2 emissions from purchased electricity are expected to constitute a substantial share and scale directly with AI server energy consumption. According to the International Energy Agency12, data centres and data transmission networks account for 0.6% of global carbon emissions through their electricity consumption. The industry’s energy consumption could double by 2026, driven by AI and other sectors13, threatening decarbonization targets under the Paris Agreement14,15, which include a 53% reduction in data-centre emissions by 2030 and net-zero goals for the AI sector. Further, growing AI server energy consumption implies an increasing water footprint through Scope 1 (direct cooling) and Scope 2 (indirect procurement) water use8,16. Centralized installation in water-stressed regions may perturb local water balances and threaten supply for millions17,18.
Previous research has developed bottom-up and top-down methods to assess the energy–water–carbon outcomes of servers19,20,21, but these approaches face challenges with the rise of AI servers. Top-down approaches based on activity indices, such as data traffic and computing instances, fail to accurately represent AI-driven workloads22,23. Detailed bottom-up approaches suffer from limited data availability and a lack of industry insight7. Assumptions valid for traditional colocation centres often fail for AI data centres, which differ in installation and operation24. Recent studies have explored computing task-based analyses to better quantify AI-related energy and resource consumption, providing insights but lacking systematic policy guidance25,26. Two notable contributions are the 2024 data-centre report by Lawrence Berkeley National Laboratory27 and the 2025 Energy and AI report by the International Energy Agency28, which present scenarios estimating US and global data-centre energy use under highly uncertain AI growth. Although their use of confidential commercial data may limit reproducibility, they offer important benchmarks. Our study extends this foundation by (1) developing an open-source, bottleneck-based modelling approach with comprehensive uncertainty analysis; (2) systematically assessing energy, water and carbon impacts, incorporating dynamic interactions with local energy systems; and (3) proposing actionable mitigation strategies for potential net-zero trajectories across different AI server deployment scenarios.
Here we analyse the combined energy–water–climate impact of operational AI servers in the United States between 2024 and 2030, balancing importance and future uncertainties and addressing a series of fundamental questions. (1) What are the magnitude and spatiotemporal distributions of energy consumption, water footprint and climate impact from AI server deployment? We address this using temporal projection models and regional frameworks, assuming deployment mirrors current large-scale AI data-centre patterns. (2) What are the prospects for near-term net-zero pathways? We evaluate this by analysing best- and worst-case scenarios of key drivers, including industry efficiency improvements, server location distribution and grid decarbonization. The results of these analyses are presented in the following sections.
AI servers’ environmental impact in the United States
This section provides a baseline depiction of AI servers’ energy, water and carbon impacts in the United States, emphasizing the spatiotemporal characteristics of the system. Figure 1a shows the projected cumulative AI server installations in the United States from 2024 to 2030 under five scenarios: low demand, low power, mid-case, high application and high demand. The mid-case serves as the base scenario, while the low- and high-demand scenarios set the projection bounds. Low power assumes server efficiency gains, and high application accounts for increased adoption driven by efficiency. Figure 1b illustrates the state-level allocation of AI servers, showing power usage effectiveness (PUE), projected grid carbon intensity (carbon emissions per unit of electricity), water usage effectiveness (WUE) and projected grid water intensity (water footprint per unit of electricity) for each state. Southern states such as Florida exhibit higher PUE and WUE than northern states such as Washington, reflecting local climate conditions. Moreover, the grid factors demonstrate notable sensitivity to location, underscoring the importance of the local grid for AI servers’ environmental impacts. Figure 1c–g shows the energy, water and carbon results. Figure 1c illustrates that AI server energy consumption predominates over infrastructure energy. Figure 1d indicates that the indirect water footprint contributes 71% of the total, with direct use accounting for the remaining 29%. Annual estimates of energy consumption, water footprint and carbon emissions of AI servers from 2024 to 2030 under each scenario are presented in Fig. 1e–g. Even the lowest scenarios outline a considerable increase in the energy, water and carbon footprints of AI servers. The highest scenario yields the greatest environmental impact, largely surpassing previous forecasts for the entire US data-centre market18,29, underscoring the environmental risks of unchecked AI server expansion.
Fig. 1: Projections of energy, water and carbon footprints of the installed AI servers from 2024 to 2030. Each scenario is denoted by a different colour.
a, The projected cumulative capacity of AI servers in the United States from 2024 to 2030 under different scenarios. b, Spatial allocation data for each state, accompanied by corresponding metrics, including PUE, WUE, grid carbon factor (kgCO2-equivalent kWh−1) and grid water factor (L kWh−1). The metric values are calculated as the average over 2024 to 2030. c, The ratios between AI server and infrastructure energy consumption. d, The ratios of the indirect to direct water footprint of AI servers. e–g, The energy consumption (e), water footprint (f) and carbon emissions (g) of AI servers from 2024 to 2030 under different scenarios. The red dashed lines in e–g denote the forecast footprint of US data centres, based on previous literature18,29.
The United States is selected as the research region because of its dominant position in the global AI market. The research period of 2024–2030 reflects a trade-off between relevance and uncertainty. Projecting the cumulative AI server capacity in the United States is the initial step, based on forecasts of AI chip manufacturing capacity, AI server specifications and AI server adoption patterns. In addition, we assume that AI data centres will follow the current allocation ratios of AI companies’ large-scale data centres, presented in detail in Supplementary Fig. 5. The PUE and WUE values are derived from a hybrid statistical and thermodynamics-based model30. Supplementary Table 3 lists all model inputs, and the applied values are calculated as the average of the best and worst practices. In addition, projected grid factors are calculated on the basis of the Regional Energy Deployment System (ReEDS) model31, incorporating projected data-centre load data and adopted regulations such as the Inflation Reduction Act. The step-by-step calculation process is summarized in Methods and sections 1–4 of Supplementary Information. Important uncertainties, such as PUE and WUE estimation, technology advancements, the spatial distribution of AI server allocation and grid development patterns, together with a sensitivity analysis of key parameters, are discussed in the following sections.
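To make the accounting chain above concrete, the sketch below assembles a single state-year footprint from installed IT load, PUE, WUE and projected grid factors. It is a minimal illustration only: the function name, the unit conventions (WUE per unit of IT energy, grid factors per unit of total facility energy) and all numbers are assumptions for this example, not the study's actual model or inputs.

```python
# Minimal sketch of the footprint accounting chain described above.
# All values are illustrative placeholders, not the study's state-level inputs.

def annual_footprints(it_energy_twh, pue, wue_l_per_kwh,
                      grid_carbon_kg_per_kwh, grid_water_l_per_kwh):
    """Return (energy in TWh, water in million m3, carbon in MtCO2e) for one state-year."""
    total_energy_twh = it_energy_twh * pue                        # server load plus facility overhead
    total_kwh = total_energy_twh * 1e9                            # TWh -> kWh
    direct_water_m3 = it_energy_twh * 1e9 * wue_l_per_kwh / 1e3   # on-site cooling water (Scope 1)
    indirect_water_m3 = total_kwh * grid_water_l_per_kwh / 1e3    # water embedded in grid power (Scope 2)
    carbon_t = total_kwh * grid_carbon_kg_per_kwh / 1e3           # Scope 2 emissions in tonnes
    return (total_energy_twh,
            (direct_water_m3 + indirect_water_m3) / 1e6,          # m3 -> million m3
            carbon_t / 1e6)                                       # t -> Mt

# Illustrative example: 100 TWh of IT load, PUE 1.2, WUE 0.5 L/kWh,
# grid factors of 0.35 kgCO2e/kWh and 2.0 L/kWh (all placeholder values).
print(annual_footprints(100, 1.2, 0.5, 0.35, 2.0))
```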
Higher energy and water usage efficiencies
Over the past decade, efficiency gains in the data-centre industry have stabilized environmental costs despite a doubling of computing instances20. This section examines the remaining potential for further improvements through system optimization and technology adoption. Figure 2a,b illustrates the achievable PUE and WUE values of AI data centres in each state. The best-practice scenario suggests notable reduction potential: over 7% PUE reduction and over 85% WUE reduction, despite the already high efficiency of AI data centres compared with the industry averages (PUE 1.58, WUE 1.8 L kWh−1). The worst-practice scenario underscores the risk of neglecting efficiency efforts. The effects of the achievable PUE and WUE values are further depicted in Fig. 2c,d, showing the corresponding influences on the energy, water and carbon footprints of AI servers. PUE reduction yields over 7% reductions in total energy consumption and carbon emissions. WUE reduction efforts result in over a 29% reduction in the total water footprint. Moreover, PUE and WUE efforts can reinforce each other, as observed in the WUE improvement results. Figure 2e examines the potential impact of adopting advanced technologies within AI data centres, focusing on advanced liquid cooling (ALC) and server utilization optimization (SUO). The results show that the best ALC adoption can reduce AI server energy consumption by about 1.7%, the water footprint by 2.4% and carbon emissions by 1.6% by 2030. For SUO, the best-case scenario, representing total adoption by 2030, results in a 5.5% reduction in all footprint values, while the worst-case scenario, representing frozen adoption, leads to a 7.3% increase by 2030. The energy–water–carbon impacts of AI servers from 2024 to 2030 under the mid-case scenario with the worst, base and best industry practices are further presented in Fig. 2f–h. The maximum reductions of energy, water and carbon achievable from the existing potential are about 12%, 32% and 11%, respectively. These findings underscore the considerable impact of industry efforts on the environmental cost of AI servers.
Fig. 2: Assessment of industry efforts aimed at reducing the environmental impact of AI servers.
a, Comparison of the best, base and worst practices regarding AI data-centre PUE. b, Comparison of the best, base and worst practices regarding AI data-centre WUE. c, Analysis of the impact of PUE practices on energy consumption, water footprint and carbon emissions. d, Analysis of the impact of WUE practices on energy consumption, water footprint and carbon emissions. e, Assessment of the effect of ALC and SUO adoption on energy consumption, water footprint and carbon emissions. f–h, The energy consumption (f), water footprint (g) and carbon emissions (h) of AI servers from 2024 to 2030 following the mid-case scenario through the worst, base and best practices of all considered industry efforts.
Base values of PUE and WUE are calculated by averaging the best and worst practices because of the lack of cooling specifications. The worst and best values are derived by solving the corresponding optimization problems with constrained operational parameters. Specifically, the best PUE is achieved mainly by extending the free-cooling period through input-air set-point adjustments and enhancing facility energy efficiency. WUE is improved by reducing windage and concentration water losses, adopting air-side economizers and enabling more free cooling. The model is based on previous work30, with relevant parameters detailed in section 3 of Supplementary Information. The best, base and worst practices for ALC and SUO adoption are developed from existing studies and market reports, with assumptions and calculations outlined in Methods. ALC adoption is evaluated through increased immersion cooling in AI data centres, while SUO improvements focus on raising the active server ratio. Importantly, advances in AI hardware could reduce energy, water and carbon footprints by improving energy efficiency, as anticipated for Nvidia’s Blackwell and Rubin chip architectures. However, these gains may be offset by the rebound effect. The uncertainties of hardware evolution and market dynamics are addressed in the scenario definitions, as detailed in section 1 of Supplementary Information.
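As an illustration of the free-cooling lever described above, the sketch below shows how extending the annual free-cooling period lowers an annualized PUE in a simplified two-mode cooling model. The overhead fractions, hour counts and set-point interpretation are assumptions chosen for illustration, not the hybrid statistical and thermodynamics-based model used in the study.

```python
# Simplified two-mode cooling model: the annualized PUE is a weighted average
# of free-cooling and mechanical-cooling overheads. All parameters are
# illustrative assumptions, not the study's optimized values.

def annual_pue(hours_below_setpoint, total_hours=8760,
               free_cooling_overhead=0.08, mechanical_overhead=0.35,
               other_overhead=0.06):
    """Weight the cooling overhead by the share of hours free cooling is available."""
    free_share = hours_below_setpoint / total_hours
    cooling = free_share * free_cooling_overhead + (1 - free_share) * mechanical_overhead
    return 1.0 + cooling + other_overhead   # IT load normalized to 1

# Raising the supply-air set point might add roughly 1,500 free-cooling hours
# in a temperate climate (assumed numbers).
print(annual_pue(hours_below_setpoint=4500))   # ~1.27
print(annual_pue(hours_below_setpoint=6000))   # ~1.23
```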
Influences of AI server spatial distribution
The location of data centres critically shapes their environmental impact18,32. This section presents a detailed projection of how AI server spatial distribution may influence the environmental consequences of rapid US expansion. Figure 3a,b presents the top 25%, 50% and 75% of locations with the lowest projected water footprint and carbon emissions per unit of server energy. Figure 3c presents the locations with the lowest combined water and carbon factors. These allocation strategies are evaluated at the grid balancing-area level. Specifically, the ReEDS model is deployed to calculate the grid factors of each balancing area under different projection scenarios. Figure 3d shows that Texas plays a vital role, possessing the most balancing areas within the top 25%, 50% and 75% for water and carbon factors. Other western and southern states, including Montana, Louisiana, Idaho and New Mexico, also account for a large share of areas under the combined strategies owing to widely leveraged local renewables. West Coast states, such as California, Oregon and Washington, as well as the New England states, are suitable for carbon reduction but also lead to higher water footprints. The primary driver behind this is the adoption of hydropower, which consumes large volumes of water through evaporation, as detailed in Supplementary Fig. 6. With more hydropower in the grid, the Scope 2 water usage of AI servers could increase dramatically when servers are installed in such locations. Figure 3e–g depicts the changes in total energy consumption, water footprint and carbon emissions of AI servers under the 25%, 50% and 75% spatial distribution scenarios. The base scenario is applied to all simulations to isolate the influence of spatial distribution. Installing AI servers in the top water-oriented locations results in large reductions of both water and carbon footprints. Conversely, the carbon-oriented strategies lead to higher water footprints owing to hydropower use in the West Coast and New England states. Moreover, Fig. 3g shows that the water and carbon footprints can be concurrently reduced following a combined allocation strategy.
Fig. 3: Impact of spatial distribution on water footprint and carbon emissions of AI servers.
a, The top 25%, 50% and 75% of locations with the lowest projected water footprint per unit energy (2024–2030). b, The top 25%, 50% and 75% of locations with the lowest projected carbon emissions per unit energy. c, Locations ranked in the top 25%, 50% and 75% for both lowest water and carbon per unit energy. d, State area ratios under water- and carbon-oriented allocation strategies. e, Changes in total energy–water–carbon footprints for the 25%, 50% and 75% water-oriented scenarios. f, Changes in total water–carbon footprints for the 25%, 50% and 75% carbon-oriented scenarios. g, Changes in total energy–water–carbon footprints for the 25%, 50% and 75% combined water- and carbon-oriented scenarios. h, Current water footprint and water scarcity per unit server energy by state. i, Current carbon emissions per unit server energy and renewable energy potential by state.
To project the potential risks and benefits of future AI server deployment, Fig. 3h,i illustrates the current water scarcity and renewable energy potential of each state. The top ten states grappling with severe water scarcity are California, Nevada, Arizona, Utah, Washington, New Mexico, Colorado, Wyoming, Oregon and Montana, concentrated primarily in the western United States. Large-scale adoption of hydropower increases the water footprint per unit of energy in several states, such as Arizona, California, Nevada, New Mexico and Utah, exacerbating water scarcity. In addition, Texas, New Mexico, Kansas, Arizona, California, Colorado, Nevada, Nebraska, Oklahoma and South Dakota are identified as the top ten states with abundant renewable energy potential. Combined with the water- and carbon-oriented strategies, Texas, Montana, Nebraska and South Dakota, situated in the central United States, emerge as optimal candidates for AI server installation, considering both water scarcity concerns and future decarbonization efforts.
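The ranking logic behind the water-, carbon- and combined-oriented strategies can be sketched as a simple quantile filter over balancing-area grid factors, as below. The four balancing-area labels and their factor values are invented for illustration; the study derives these factors from ReEDS projections.

```python
# Sketch of the location-ranking logic behind Fig. 3a-c: rank grid balancing
# areas by projected water and carbon factors and keep the intersection for the
# combined strategy. The example areas and factors are invented placeholders.
import pandas as pd

areas = pd.DataFrame({
    "balancing_area": ["TX_1", "MT_1", "CA_1", "FL_1"],
    "water_l_per_kwh": [1.1, 0.9, 3.5, 2.0],        # projected grid water factor
    "carbon_kg_per_kwh": [0.20, 0.25, 0.10, 0.45],  # projected grid carbon factor
})

def lowest_fraction(df, column, frac):
    """Balancing areas within the lowest `frac` quantile of `column`."""
    return set(df.loc[df[column] <= df[column].quantile(frac), "balancing_area"])

water_oriented = lowest_fraction(areas, "water_l_per_kwh", 0.50)
carbon_oriented = lowest_fraction(areas, "carbon_kg_per_kwh", 0.50)
combined = water_oriented & carbon_oriented   # jointly low water and carbon
print(water_oriented, carbon_oriented, combined)
```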
Influence of renewable energy penetration within the grid
This section analyses how future grid development will affect the environmental impacts of AI servers. Figure 4a,b illustrates the changes in grid carbon and water factors for each state under the low renewable energy cost (LRC, best-practice) and high renewable energy cost (HRC, worst-practice) scenarios, compared with the base scenario. These scenarios, drawn from the predefined cases of the ReEDS model31,33, represent the highest and lowest levels of renewable energy penetration, respectively. The resulting total carbon emissions and water footprint of AI servers are depicted in Fig. 4c,d. The HRC scenario indicates a 20% increase in carbon emissions alongside a 2.0% increase in water footprint, while the LRC scenario suggests an over 15% reduction in carbon emissions accompanied by a 2.5% reduction in water footprint. The carbon emissions of AI servers are shown to be heavily influenced by the grid decarbonization pattern, indicating both considerable reduction potential and associated risks.
Fig. 4: Assessment of the impact of grid renewable energy penetration on the water footprint and carbon emissions of AI servers.
a, Changes in the grid water factor under the best and worst scenarios compared with the base scenario. b, Changes in the grid carbon factor under the best and worst scenarios compared with the base scenario. c,d, The water footprint (c) and carbon emissions (d) of AI servers from 2024 to 2030 following the mid-case scenario under the best, base and worst scenarios, respectively. e, The changes in AI server carbon and water footprints for each state under the best (LRC) and worst (HRC) scenarios.
Figure 4e details the changes in AI server carbon emissions and water footprints by state under the LRC and HRC scenarios. The grid development pattern, including renewable energy penetration levels and sources, notably affects the resulting footprints of each state. States such as Georgia, Nevada, North Carolina and Tennessee show marked sensitivity to the renewable cost scenarios. In addition, the Pacific states, including California, Oregon and Washington, which already achieve low grid carbon factors, slow their pace of hydropower adoption under the LRC scenario, avoiding exacerbating their water scarcity issues as additional AI servers are installed. Wind and solar resources, which consume no water during generation, are adopted more widely in the LRC scenario to reduce carbon emissions while alleviating the severe water scarcity challenges in these states. These results suggest that the impacts of grid decarbonization patterns on the environmental costs of AI servers are evident not only over time but also in the spatial variation among US states. They also point to the importance of ambitious green electricity policies in US states, which could further reduce carbon emissions relative to our scenarios.
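For a fixed AI server load, the state-level footprint shifts in Fig. 4e follow directly from the scenario grid factors, as the brief sketch below illustrates. The state loads and grid carbon factors here are invented placeholders, not the ReEDS scenario values used in the study.

```python
# Hedged sketch of the Fig. 4e comparison: state-level carbon changes under the
# LRC/HRC grid scenarios for a fixed AI server load. All values are invented.

state_load_twh = {"GA": 20.0, "NV": 10.0}           # assumed AI server energy per state

grid_carbon = {                                      # kgCO2e per kWh (illustrative)
    "base": {"GA": 0.40, "NV": 0.30},
    "LRC":  {"GA": 0.28, "NV": 0.20},                # low renewable cost -> faster decarbonization
    "HRC":  {"GA": 0.48, "NV": 0.38},                # high renewable cost -> slower decarbonization
}

for state, twh in state_load_twh.items():
    base_mt = twh * 1e9 * grid_carbon["base"][state] / 1e9   # kWh x kg/kWh -> kg -> Mt
    for scen in ("LRC", "HRC"):
        scen_mt = twh * 1e9 * grid_carbon[scen][state] / 1e9
        print(f"{state} {scen}: {scen_mt - base_mt:+.1f} MtCO2e vs base")
```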
Pathways to net-zero carbon and water goals
Building on the preceding analysis of environmental costs, efficiency, spatial distribution and grid decarbonization, this section evaluates water and carbon net-zero pathways for US AI servers. Figure 5a presents the pathways from 2024 to 2030 to achieve net-zero carbon emissions and a net-zero water footprint of AI servers under the mid-case scenario. Figure 5b presents the residual emissions and water footprints across different scenarios and practices. The 2030 target aligns with the goals of major AI data-centre operators14,34,35. The top and bottom 25% of locations are used to create the best and worst cases for spatial distribution. The best and worst grid development practices are modelled under the LRC and HRC scenarios, respectively. These practices bound the projections, although better outcomes could be achieved through additional policies and actions beyond those considered here.
Fig. 5: The pathways towards achieving net-zero carbon emissions and water footprints for US AI servers.
a, The contributions of each influential factor to the water footprint and carbon emissions of AI servers with best and worst practices under the mid-case scenario. The total contributions of each factor from 2024 to 2030 are also listed. The curves above the zero level indicate increases in carbon emissions and water footprint, while the curves below represent reductions. The grey area represents the residual footprints that need to be reduced. b, The residual carbon emissions and water footprints that must be offset to attain net-zero targets in each year from 2024 to 2030 under different temporal scenarios, for both best- and worst-practice cases. Specifically, the top and bottom 25% of locations are used to create the best and worst cases for spatial distribution. The best grid development practice is modelled under the LRC scenario, and the worst is modelled under the HRC scenario.
The best and worst distribution patterns of AI server deployment lead to a 49% reduction and a 90% increase in carbon emissions, and a 52% reduction and a 354% increase in water footprints, respectively. Optimistic grid decarbonization provides a further 13% reduction in carbon, while the worst case results in a 23% increase. Industry efficiency efforts are critical for water sustainability, offering over a 32% reduction in water footprint under best practices. Notably, combined best practices cut residual emissions and water footprints by 73% and 86%, respectively, indicating a feasible pathway to net zero. Under the mid-case, the best industry, spatial and grid scenarios reduce carbon emissions by over 21 Mt, 25 Mt and 92 Mt, respectively, from a base of 186 Mt attributable to AI server installation. The best practices leave about 11 Mt of residual carbon emissions by 2030, requiring 28 GW of wind or 43 GW of solar capacity to fully offset36. With over 13 GW of AI company renewables already claimed37, residual emissions could remain manageable with further expansion. However, achieving this best-case scenario could be extremely challenging, particularly because of facility constraints on deploying AI servers at optimal locations and the difficulty of reaching ideal energy and water efficiency levels within AI data centres. Conversely, the worst practices pose a risk of unachievable net-zero pathways, implying 71 Mt of annual residual carbon emissions and over 5,224 million m3 of annual residual water usage by 2030 under the low-demand scenario. Such an environmental cost would be nearly impossible to offset fully within a short period.
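The conversion from residual emissions to the renewable capacity needed to offset them can be sketched as below. The capacity factors and the avoided grid intensity are illustrative assumptions, so the output will not reproduce the 28 GW wind and 43 GW solar figures above, which rest on the paper's own reference data36.

```python
# Back-of-envelope conversion from residual emissions to renewable capacity.
# Capacity factors and avoided grid intensity are assumptions for illustration.

HOURS_PER_YEAR = 8760

def capacity_to_offset(residual_mt_co2e, capacity_factor, avoided_kg_per_kwh):
    """GW of renewable capacity whose annual generation displaces the residual emissions."""
    required_kwh = residual_mt_co2e * 1e9 / avoided_kg_per_kwh              # Mt -> kg, then kg / (kg/kWh)
    return required_kwh / (capacity_factor * HOURS_PER_YEAR) / 1e6          # kWh per year -> GW

residual = 11.0  # MtCO2e of residual emissions by 2030 (figure from the text)
print(f"wind:  {capacity_to_offset(residual, 0.35, 0.40):.0f} GW")  # assumed 35% capacity factor
print(f"solar: {capacity_to_offset(residual, 0.24, 0.40):.0f} GW")  # assumed 24% capacity factor
```

The result scales inversely with the assumed avoided intensity and capacity factor, which is why offset-capacity estimates vary widely across sources.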
Projected net-zero pathways could be further influenced by other uncertainties. Figure 6 presents a sensitivity analysis of key factors: server lifetime, AI server manufacturing capacity, US allocation ratio, server idle and maximum power ratios, and training/inference distributions. The low, applied and high values of each parameter are listed in the figure; the left and right halves present the effects of lower and higher values, respectively. Changes in these factors result in up to 40% variations in energy consumption, which are closely mirrored by corresponding shifts in water footprint and carbon emissions. Unmodelled uncertainties can largely be reflected by modifying these simulation factors. For example, innovations such as DeepSeek can lower the power requirements of AI computing tasks, which may in turn trigger a rebound effect that brings more AI applications. This could imply larger chip-on-wafer-on-substrate (CoWoS) manufacturing capacity owing to expanding applications, as well as longer server lifetimes and a smaller training-to-inference ratio owing to more inference-based usage. As Fig. 6 indicates, the study’s key conclusions could remain robust unless future uncertainties greatly exceed the modelled ranges. The modelling approach used in this study enables future revisions as more data become available. The sensitivity analysis aims to capture the uncertainties inherent to this problem, offering insights into the magnitude of their influence on the projections.
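A one-at-a-time sensitivity screening of this kind can be sketched as below: each parameter is swung between its low and high value while the others stay at their applied values, and the resulting spread in a footprint metric is recorded. The toy energy model, the parameter names and all numbers are illustrative assumptions, not the study's projection model or ranges.

```python
# One-at-a-time sensitivity screening (illustrative stand-in for Fig. 6).

applied = {"lifetime_yr": 5, "us_allocation": 0.6, "max_power_ratio": 0.8}
ranges = {
    "lifetime_yr": (4, 6),
    "us_allocation": (0.5, 0.7),
    "max_power_ratio": (0.7, 0.9),
}

def energy_twh(p):
    """Toy stand-in for the full projection model (assumed shipments and power)."""
    shipped_per_year = 1.0e6                       # assumed servers shipped per year
    kw_per_server = 10.0 * p["max_power_ratio"]    # assumed nameplate power x utilization
    fleet = shipped_per_year * p["lifetime_yr"] * p["us_allocation"]
    return fleet * kw_per_server * 8760 / 1e9      # kWh -> TWh

base = energy_twh(applied)
for name, (lo, hi) in ranges.items():
    spread = [energy_twh({**applied, name: v}) for v in (lo, hi)]
    print(f"{name}: {min(spread) - base:+.0f} to {max(spread) - base:+.0f} TWh vs base {base:.0f}")
```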
Fig. 6: Sensitivity of AI server energy–water–carbon impacts to key modelling assumptions.
Sensitivity analysis for AI server energy consumption, water footprint and carbon emissions considering uncertainties of server lifetime, manufacturing capacities for AI servers, US allocation ratio, server idle power ratio, server maximum power ratio and training/inference distributions. The listed numbers for each parameter from left to right represent the considered low, applied and high values.
Discussion
Investment in AI servers is accelerating, as seen in projects such as the $500 billion Stargate3. While AI advancement is a key priority, our study highlights its environmental impact. To mitigate these risks, we identify that concentrating AI server deployment in Texas, Montana, Nebraska and South Dakota is optimal, given their abundant renewables, low water scarcity and favourable projected unit water and carbon intensities. These states possess substantial untapped wind and solar resources, enabling robust green power portfolios and reducing competition with other sectors. Their lower water stress also helps ease public concerns and reduces the need for costly water-saving measures. However, several implementation challenges must be acknowledged. Texas, crucial to the optimal strategy, may need to support an additional 74–178 TWh of AI server demand, possibly exceeding its current total renewable generation of 139 TWh (ref. 38). This scale-up would require substantial investment in new renewable capacity and transmission infrastructure, which is already constrained by existing congestion39. Meanwhile, Montana, Nebraska and South Dakota currently host minimal data-centre capacity, indicating that most of their existing internet infrastructure may support only residential or standard industrial applications. This raises potential connectivity and security concerns for high-performance AI services. Enabling AI-grade infrastructure in these regions will require broadband and security upgrades, with the associated emissions and capital costs increasing the expense of the optimal strategy. These challenges underscore the complexity of sustainable siting decisions and highlight the need for strategic coordination to ensure that AI investments support both technological leadership and long-term sustainability goals.
Expanding AI servers in the suggested regions may be further obstructed by public health concerns and other impacts on local sustainable development. Operational demands and the construction of supporting facilities could generate air and water pollution through fossil fuel use, substantial water consumption, and large-scale construction and transportation. To address these issues while managing economic costs, we recommend that AI companies engage in public–private partnerships with local governments. Such partnerships could alleviate governmental budget pressures, create local jobs and mitigate health concerns by funding green power upgrades and monitoring systems, such as tracking PM2.5 (particulate matter with a diameter of 2.5 micrometres or less) levels, through combined AI investment and regulation. For governments, implementing tax exemptions on related facilities could incentivize a win–win development process, and it is essential to ensure transparent accountability within public–private partnerships to prevent privatized gains at public cost. Beyond environmental benefits, these measures could stimulate local economic growth, as evidenced in Virginia40. This discussion outlines policy recommendations for both industry and legislative stakeholders that balance economic and environmental impacts. These recommendations also emphasize energy market adaptations as AI-driven hyperscale loads challenge utilities’ ‘duty to serve’. Requirements to connect large data centres could lead to server co-location with energy generation, but as shown in this study, these choices should not crowd out renewable investments to decarbonize the economy. Finally, the effectiveness of the recommended policies may be influenced by broader political and economic factors beyond the scope of this study.
The challenges associated with the optimal spatial distribution of AI servers introduce substantial uncertainties into the projected reductions of 73% and 86% in carbon and water footprints, respectively, under best-practice scenarios. These challenges are unlikely to be offset by improvements in efficiency or decarbonization efforts beyond what has already been considered. Our best-case scenario for industry efficiency approaches the physical limits of AI data centres. Moreover, projections from the US Energy Information Administration offer limited support for additional grid decarbonization beyond the considered best case41. These constraints suggest that, without additional interventions, AI data centres are likely to generate substantial environmental impacts in the coming years. Consequently, AI companies are expected to continue investing in offset mechanisms, including power purchase agreements, carbon removal and water restoration, reaching an unprecedented level of reliance on them to accomplish their net-zero aspirations by 2030. Nevertheless, serious concerns remain regarding the complexity of securing long-term contracts and providing credible reduction credits9,42, especially given the considerable capacity that would be needed. We argue that AI companies should shift to a more transparent approach by cooperating closely with third-party verification groups, service providers and governmental agencies. Such collaboration could reduce uncertainties through better integration of social resources and serve as a model for other electrified sectors.
Emerging high-efficiency technologies in hardware and software, exemplified by DeepSeek43, may fundamentally transform AI server supply and demand. As reflected in the study’s five scenarios, these innovations may cause deviations of up to 393 million m3 in water footprints and 20 MtCO2-equivalent in emissions between the minimum and maximum impact cases, underscoring the need for tailored management. While efficiency gains may reduce the cost per computing task, they risk a rebound effect, in which lower costs increase total application volume. This dynamic, as reflected by the high-application scenario, may amplify total demand and complicate AI’s environmental trajectory. To address these uncertainties, we recommend that government agencies work with industry to establish real-time monitoring systems, enabling timely alerts and proactive measures before considerable environmental impacts occur. Moreover, the potential increase in total computing jobs poses both challenges and opportunities, calling for ongoing enhancements in energy and water efficiency through system optimization and the adoption of strategies such as SUO and ALC to manage the added workload complexity and flexibility. Therefore, we also suggest that the data-centre industry establish AI-specific benchmarks for energy, water and carbon performance, which is crucial for continuous operational efficiency gains.
Methods
The methodological framework of this study aims to achieve two goals: (1) to characterize the energy–water–climate impacts of AI servers in the United States from 2024 to 2030, addressing widespread concerns about AI development, and (2) to identify the best and worst practices for each influencing factor in order to chart net-zero pathways for the water and climate targets set for 2030. Compared with many previous climate pathway studies, which often extend predictions to 2050 to better integrate climate goals, this study focuses on the period from 2024 to 2030 because of the great uncertainties surrounding the future of AI applications and hardware development. To assess these uncertainties, scenario-based projections are first constructed to capture potential patterns of AI server capacity growth. Technology dynamics, such as SUO and ALC adoption, are defined with best, base and worst scenarios, and a similar method is employed to capture the impacts of grid decarbonization and spatial distribution. The models and data used in the calculations are described in the following sections. More details on model assumptions and data generation are provided in sections 1–4 of Supplementary Information.
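One way to picture the scenario structure described above is to cross the demand scenarios with best/base/worst practice levels along the three mitigation dimensions, as in the short sketch below. The enumeration is an illustrative framing of the study's scenario logic, not its actual simulation workflow.

```python
# Illustrative enumeration of the scenario structure: demand scenarios crossed
# with best/base/worst practices for efficiency, spatial distribution and grid.
from itertools import product

demand_scenarios = ["low demand", "low power", "mid-case", "high application", "high demand"]
practice_levels = ["best", "base", "worst"]
dimensions = ["industry efficiency", "spatial distribution", "grid development"]

runs = [
    {"demand": d, **dict(zip(dimensions, levels))}
    for d, levels in product(demand_scenarios, product(practice_levels, repeat=len(dimensions)))
]
print(len(runs))   # 5 demand scenarios x 3^3 practice combinations = 135 runs
print(runs[0])
```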
Data description and discussion
This section provides a comprehensive overview of the data used in this study. Historical DGX (Nvidia’s high-performance AI server line) parameters were sourced from official documentation, and future scenarios were projected on the basis of historical configurations and current industry forecasts. To estimate the number of AI server units, we collected the latest industry report data to project the future manufacturing capacity of CoWoS technology, which is the bottleneck for top-tier AI server production. The data sources for this process are introduced and validated in section 1 of Supplementary Information. AI server electricity usage was assessed using recent experimental data on maximum power44