
We need to talk about the environmental impact of AI models

A new study highlights the enormous water footprint of AI models such as ChatGPT and the need for tech giants to take social responsibility

The companies heavily investing in AI haven’t yet addressed the social responsibilities that come with it. (REUTERS/Dado Ruvic/Illustration/File Photo)

There is a lot of talk about artificial intelligence (AI) these days, but what’s missing from these exciting conversations about the future is how AI chatbots could threaten it. While tech giants race to roll out the most advanced AI-powered updates, the environmental impact of these AI models has not received the attention it deserves.

In a landmark report, Turning the Tide, released in March, experts warned of an imminent water crisis, stating that by 2030 the demand for freshwater is expected to outstrip supply by 40%. This is one of many serious threats the planet currently faces. Yet even as climate activists around the world call attention to depleting water resources, companies investing heavily in AI haven’t yet addressed AI’s enormous water footprint.


A recent study titled Making AI Less Thirsty, conducted by researchers from the University of California, Riverside and the University of Texas at Arlington, highlighted the water footprint of AI models. For instance, OpenAI’s ChatGPT is estimated to consume about a 500ml bottle of water to answer roughly 20 to 50 questions, depending on when and where it is deployed. Considering that ChatGPT is currently used by millions across the globe, the combined water footprint could be massive.
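The study’s per-question figure makes a rough aggregate estimate straightforward. The sketch below uses the 500ml-per-20-to-50-questions range from the paper; the daily query volume is a purely hypothetical assumption for illustration, not a figure from the study.

```python
# Back-of-envelope estimate of aggregate water use, based on the study's
# figure of ~500 ml of water per 20-50 ChatGPT questions.
LITRES_PER_QUESTION_LOW = 0.5 / 50   # 500 ml spread over 50 questions
LITRES_PER_QUESTION_HIGH = 0.5 / 20  # 500 ml spread over 20 questions

# Hypothetical daily query volume (assumption for illustration only).
ASSUMED_QUESTIONS_PER_DAY = 10_000_000

low = ASSUMED_QUESTIONS_PER_DAY * LITRES_PER_QUESTION_LOW
high = ASSUMED_QUESTIONS_PER_DAY * LITRES_PER_QUESTION_HIGH

print(f"Estimated daily water use: {low:,.0f} to {high:,.0f} litres")
```

Even at this modest assumed volume, the estimate runs to hundreds of thousands of litres a day, which is why the paper argues the combined footprint deserves scrutiny.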

AI models such as ChatGPT require a vast amount of energy to train and run. In 2019, researchers from the University of Massachusetts, Amherst, conducted a life-cycle assessment of training several common large AI models. They found that the process could emit more than 626,000 pounds of carbon dioxide equivalent. To put this in perspective, that is nearly five times the lifetime emissions of an average American car, including its manufacture, according to an article in MIT Technology Review. The researchers mainly examined the training process for natural-language processing (NLP), the subfield of AI that focuses on teaching machines to handle human language.
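The comparison above is simple division. The car-lifetime figure below is the approximate number used in the MIT Technology Review comparison and is stated here as an assumption.

```python
# Putting the 2019 training-emissions estimate in context.
TRAINING_EMISSIONS_LBS = 626_000      # CO2-equivalent for training a large NLP model

# Approximate lifetime emissions of an average American car, including its
# manufacture (assumed figure, as used in the MIT Technology Review comparison).
CAR_LIFETIME_EMISSIONS_LBS = 126_000

ratio = TRAINING_EMISSIONS_LBS / CAR_LIFETIME_EMISSIONS_LBS
print(f"Training emits roughly {ratio:.1f}x a car's lifetime emissions")
```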

Since then, the field of NLP has seen noteworthy advances such as GPT-3, GPT-3.5 and GPT-4. However, their water footprint has remained under the radar. The recent paper showed that training GPT-3 in Microsoft’s US data centres can directly consume 700,000 litres of clean freshwater, a figure that would roughly triple if the training were done in Microsoft’s Asian data centres. Earlier this year, Microsoft partnered with OpenAI to integrate ChatGPT into its services. ChatGPT, which initially ran on GPT-3.5, recently transitioned to GPT-4.

Large language models such as the GPTs, especially the bigger ones like GPT-3 and GPT-4, are energy-intensive and are trained and deployed on servers housed inside warehouse-scale data centres, their physical “homes”, as the paper explains. Data centres account for about 2% of electricity usage worldwide. Almost all of the server energy is converted into heat, which must be expelled from the data centre server room to avoid overheating.

In most data centres there are two types of cooling systems — air-cooled and water-cooled. The former generally uses fans to circulate air around the servers and equipment and cool the centre. In water-cooled systems, water is used to absorb the heat and prevent overheating of the equipment. The absorbed heat is then sent to an external cooling tower or the chiller. 

Water-cooled systems consume more water but are also more efficient. According to the 2023 study, cooling towers are the most common cooling solution for the warehouse-scale data centres run by leading companies such as Google and Microsoft.

For instance, Google’s self-owned data centres in the US consumed 12.7 billion litres of freshwater for on-site cooling in 2021, and 90% of this was potable water, according to the paper. (Google released the AI-powered Bard earlier this year to compete with ChatGPT.)

Furthermore, the water footprint of all US data centres in 2014 was estimated at 626 billion litres. Even with recirculation, a significant amount of freshwater is consumed in the process. On-site water must also be clean freshwater, to avoid corrosion, clogged water pipes and bacterial growth, the paper explains.

There is also indirect water consumption: huge amounts of water are used to generate the electricity, much of it from non-renewable sources, that powers data centres. The amount of water consumed varies with the cooling technologies employed and the power generation systems used (e.g., nuclear, coal and natural gas).

As our reliance on AI increases, it’s important to take note of its environmental impact and adopt a carbon-aware approach. In the 2023 study, the researchers showed that when and where AI models are trained makes a difference, and that the efficiency of water-saving approaches needs further study.

Because these environmental concerns are still little discussed in the AI space, there is a lack of transparency, and the general public remains mostly unaware of how to use AI models more sustainably. Using AI interfaces during water-efficient hours could help, for instance, as the paper suggests.
