Artificial intelligence programs consume large amounts of water

Cloud data processing centers face high water costs due to power and cooling.


The public has paid close attention to the growing carbon footprint of artificial intelligence (AI) models, especially huge ones like GPT-3 and GPT-4, but their water footprint has largely remained unseen. 

AI developers must take social responsibility and set an example for others by addressing their models' water footprint. A new study analyzes the distinct spatial-temporal variations in the runtime water efficiency of AI models and offers a sound methodology for estimating their fine-grained water footprint. 

Every time you execute a ChatGPT artificial intelligence query, you consume a small amount of a rapidly depleting resource: fresh water. Run 20 to 50 queries, and nearly half a liter, or 17 ounces, of fresh water from our overloaded reservoirs is lost as steam emissions. 
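The figure above implies a simple back-of-the-envelope estimate of water use per query. The sketch below uses only the article's numbers (20 to 50 queries per roughly half a liter); it is an illustration, not the study's methodology:

```python
# Back-of-the-envelope estimate of fresh water evaporated per ChatGPT query,
# based only on the article's figure: 20-50 queries consume roughly 0.5 L.

def water_per_query_ml(queries_low=20, queries_high=50, liters=0.5):
    """Return a (low, high) estimate in milliliters per query."""
    ml = liters * 1000
    # More queries sharing the same 0.5 L means less water per query,
    # so the high query count gives the low per-query bound.
    return ml / queries_high, ml / queries_low

low, high = water_per_query_ml()
print(f"~{low:.0f}-{high:.0f} mL of fresh water per query")  # ~10-25 mL per query
```

Even at the low end, the water adds up quickly across the millions of queries such services handle each day.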

A new study conducted by the University of California, Riverside, analyzed the water footprint of AI searches that rely on cloud computations performed in racks of servers in warehouse-sized data processing centers.

Google’s data centers in the United States alone required an estimated 12.7 billion liters of fresh water in 2021 to keep their systems cool. 

According to Shaolei Ren, an associate professor of electrical and computer engineering and the study's corresponding author, data processing centers consume large amounts of water in two ways. First, they draw electricity from power plants that use large cooling towers that convert water into steam emitted into the atmosphere. 

Second, because electricity flowing through semiconductors continuously generates heat, the hundreds of thousands of servers in data centers must be kept cool. This requires cooling systems, often linked to cooling towers, which consume water by transforming it into steam.

Ren said, “The cooling tower is an open loop, and that’s where the water will evaporate and remove the heat from the data center to the environment.” 

It is critical to address water use from artificial intelligence because AI is a rapidly expanding component of computing demand. 

According to the article, a two-week training for the GPT-3 AI program in Microsoft’s cutting-edge U.S. data centers consumed approximately 700,000 liters of fresh water, nearly the same amount needed to construct around 370 BMW cars or 320 Tesla electric vehicles. 

If the training had occurred in Microsoft's less efficient data centers in Asia, the water consumption would have tripled. Among other things, car manufacturing necessitates a number of washing processes to remove paint particles and pollutants. 

Training AI models during colder hours, when less water is lost to evaporation, is a quick and efficient way to prevent wasteful water use. 
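The idea of shifting training to cooler, more water-efficient hours can be sketched as a simple scheduling heuristic. The hourly water-usage-effectiveness values (WUE, liters evaporated per kWh) below are hypothetical, invented only for illustration:

```python
# Pick the start hour for a training job that minimizes evaporative water use,
# given an hourly forecast of water usage effectiveness (WUE, liters per kWh).
# The WUE values are made up for illustration; real ones vary with weather.

def best_start_hour(wue_forecast, job_hours):
    """Return the start hour whose run window has the lowest total WUE."""
    n = len(wue_forecast)
    # Total WUE over each possible window, wrapping past midnight;
    # rounding avoids float-noise ties between equal-sum windows.
    windows = {h: round(sum(wue_forecast[(h + i) % n] for i in range(job_hours)), 9)
               for h in range(n)}
    return min(windows, key=windows.get)

# Hypothetical 24-hour forecast: evaporation peaks in the midday heat.
wue = [0.8, 0.7, 0.7, 0.6, 0.6, 0.7, 0.9, 1.2, 1.5, 1.8, 2.0, 2.2,
       2.3, 2.2, 2.0, 1.8, 1.5, 1.2, 1.0, 0.9, 0.9, 0.8, 0.8, 0.8]

start = best_start_hour(wue, job_hours=4)
print(f"Start a 4-hour job at hour {start}")  # the cool early-morning window wins
```

A real scheduler would also weigh electricity prices and carbon intensity, which, as the article notes next, can pull in the opposite direction.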

Ren said, "AI training is like a big, very big lawn and needs lots of water for cooling. We don't want to water our lawns at noon, so let's not water our AI at noon either." 

This may be incompatible with carbon-efficient scheduling, which prefers to follow the sun for clean solar energy.

He said, "We can't shift cooler weather to noon, but we can store solar energy, use it later, and still be 'green.'" 

It is essential to detect and address the AI model’s hidden water footprint in light of the growing freshwater scarcity situation, increasing extended droughts, and rapidly aging public water infrastructure.

Journal Reference:

  1. Li, P., Yang, J., Islam, M. A., et al. Making AI Less "Thirsty": Uncovering and Addressing the Secret Water Footprint of AI Models. arXiv. DOI: 10.48550/arXiv.2304.03271
