
The Environmental Impact of AI: The Sustainability Burden Carried by Large Language Models

AI's rapid growth is being fueled by Large Language Models (LLMs) like GPT-4, Claude, and Gemini, but these advanced systems come with a price: they demand substantial computational resources, not just during the training phase but also during everyday operation.

In the rapidly evolving world of artificial intelligence (AI), a lesser-known but significant strain is being placed on water resources. Data centers, the physical hubs where AI models are trained and run, consume substantial amounts of water, particularly in regions with hot and dry climates.

A study by teams at UC Riverside and UT Arlington revealed that training a single large AI model consumes over 700,000 liters of clean water. This water consumption, mainly due to data center cooling needs during both training and inference phases, can put even more pressure on freshwater supplies, especially in places already facing drought or climate-related stress.

During inference, each ChatGPT query can "consume" roughly 0.3 to 0.5 liters of water when both direct cooling and the water embedded in electricity generation are counted. Scaled to billions of daily queries, that works out to hundreds of millions of liters per day, or on the order of a hundred billion liters or more per year.
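As a rough sanity check on those figures, the short Python sketch below multiplies the cited per-query range by an assumed query volume; the one-billion-queries-per-day figure is an illustrative assumption, not a reported statistic.

```python
# Back-of-envelope estimate of inference water use, based on the per-query
# range cited above. The daily query volume is a hypothetical illustration,
# not a reported figure.

LITERS_PER_QUERY_LOW = 0.3   # lower bound cited above (cooling + electricity)
LITERS_PER_QUERY_HIGH = 0.5  # upper bound cited above
QUERIES_PER_DAY = 1_000_000_000  # assumption: one billion queries per day

def daily_water_use_liters(queries_per_day: int, liters_per_query: float) -> float:
    """Return the estimated liters of water consumed per day by inference."""
    return queries_per_day * liters_per_query

low = daily_water_use_liters(QUERIES_PER_DAY, LITERS_PER_QUERY_LOW)
high = daily_water_use_liters(QUERIES_PER_DAY, LITERS_PER_QUERY_HIGH)

print(f"Daily water use:  {low / 1e6:.0f}-{high / 1e6:.0f} million liters")
print(f"Annual water use: {low * 365 / 1e9:.0f}-{high * 365 / 1e9:.0f} billion liters")
```

Under these assumptions the estimate lands at roughly 300-500 million liters per day, which is where the "hundreds of millions of liters" figure above comes from.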

Most of the water used by AI data centers is consumed through evaporative cooling: it is turned into vapor and lost from the local watershed, which is especially problematic in the arid regions where many data centers are located. This loss affects local communities, agriculture, and ecosystems, because evaporated water is no longer available for local reuse.

To mitigate this issue, environmental groups support efforts to make sustainable water sourcing a condition for approving new data centers. Improvements in AI hardware efficiency, energy sourcing (e.g., carbon-neutral power), and the transition to water-efficient or waterless cooling technologies such as liquid cooling can help reduce water and carbon footprints.

However, balancing AI advancement with sustainable water use remains a significant challenge, especially as AI deployment scales up globally. Some data centers are adopting alternative cooling methods like liquid immersion cooling and direct-to-chip cooling, but these techniques still involve indirect water usage.

In areas facing water scarcity, data center operators are shifting away from evaporative cooling and instead using air-based or closed-loop systems to reduce water consumption, although these alternatives often demand more energy. Microsoft has adopted adiabatic cooling systems, which can reduce water use by up to 90% compared to traditional cooling towers.

Building data centers in areas with sustainable water resources or near renewable energy sources can also reduce indirect water use associated with thermal power generation. Some data centers have started using on-site reuse systems or rainwater collection to supplement their water supply.

Governments can establish rules that require transparent reporting of water use and promote consistent assessment standards. Companies like Google and Meta have pledged to replenish more water than they consume, with Google committing to replenish 120% of the water it consumes, and Meta pledging to restore 200% of the water used in high-stress areas and 100% of the water used in medium-stress zones.
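To see how these percentage pledges translate into actual volumes, the sketch below applies the stated ratios to hypothetical annual consumption figures; the site volumes and the stress-zone split are assumptions for illustration only, not figures reported by either company.

```python
# Illustration of how percentage-based replenishment pledges translate into
# volumes. All consumption figures and the stress-zone split are hypothetical.

def replenishment_target(consumed_liters: float, pledge_ratio: float) -> float:
    """Volume a company would need to restore to meet a given pledge ratio."""
    return consumed_liters * pledge_ratio

# Hypothetical annual consumption at two sites (liters)
high_stress_site = 500_000_000    # site in a high water-stress area
medium_stress_site = 300_000_000  # site in a medium water-stress area

# Meta-style pledge: 200% in high-stress areas, 100% in medium-stress areas
meta_style = (replenishment_target(high_stress_site, 2.0)
              + replenishment_target(medium_stress_site, 1.0))

# Google-style pledge: 120% of total consumption
google_style = replenishment_target(high_stress_site + medium_stress_site, 1.2)

print(f"Meta-style target:   {meta_style / 1e6:.0f} million liters")
print(f"Google-style target: {google_style / 1e6:.0f} million liters")
```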

As the demand for AI-driven services continues to rise, it is crucial to address the water footprint of AI to ensure sustainable development and protect our planet's precious water resources.

Key takeaways:

  1. The surge in artificial intelligence (AI) technology and its applications in environmental science, such as climate change modeling, has brought another critical issue to light: the water consumption of data centers.
  2. Proposed solutions include adopting water-efficient or waterless cooling technologies, siting data centers near sustainable water resources and renewable energy sources, and requiring tech companies to report their water use transparently.
