
ChatGPT uses up to 1 liter of water for 50 questions

Training and running AI chatbots in data centers consumes vast amounts of energy and water. Tech companies such as Google, Microsoft, and OpenAI are reluctant to show their hand.




Training AI is power and water intensive

A few weeks ago, an independent analysis concluded that training GPT-3, the model underlying ChatGPT, consumed almost 1,300 megawatt-hours of electricity. That corresponds to the annual electricity consumption of almost 200 people.
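The per-capita comparison can be checked with simple arithmetic. The per-person figure below is implied by the article's numbers, not stated in the source:

```python
# Back-of-envelope check of the electricity comparison (figures from the article).
training_mwh = 1_300   # estimated electricity used to train GPT-3, in MWh
people = 200           # "almost 200 people" per the article

# Implied annual electricity consumption per person, in kWh
kwh_per_person_per_year = training_mwh * 1_000 / people
print(kwh_per_person_per_year)  # 6500.0 kWh per person per year
```

Around 6,500 kWh per person per year is plausible for an industrialized country, so the comparison is internally consistent.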

Four researchers from the University of California, Riverside and the University of Texas at Arlington have now specifically examined the water consumed when training artificial intelligence.




Cooling of the data centers with drinking water

According to the researchers, data centers are usually cooled with drinking water, since treated water helps prevent corrosion and bacterial growth in the cooling systems.

According to the researchers' calculations, the training of GPT-3 alone consumed around 700,000 liters of water. This roughly corresponds to the daily drinking-water consumption of over 5,000 people.
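This comparison, too, can be sanity-checked. The daily per-person figure below is derived from the article's numbers rather than quoted from the study:

```python
# Back-of-envelope check of the researchers' water comparison (figures from the article).
training_water_liters = 700_000   # estimated water consumed by GPT-3 training
people = 5_000                    # "over 5,000 people" per the article

# Implied daily drinking-water consumption per person
liters_per_person_per_day = training_water_liters / people
print(liters_per_person_per_day)  # 140.0 liters per person per day
```

Roughly 140 liters per person per day is in line with typical household water use in Western countries, so the two figures fit together.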




Calculations are based on estimates

However, as with electricity consumption, the researchers had to base their calculations on estimates, because OpenAI has not yet revealed how long GPT-3 was trained.

Microsoft, in turn, has stated that its new supercomputer used for AI development comprises 10,000 graphics cards and 285,000 processor cores, as Futurezone reports.

According to the researchers, this amount of water corresponds to what is needed to manufacture 370 BMWs or 320 Teslas.




Operating the AI chatbots also consumes resources

However, the high water consumption for cooling the data centers does not end once training is complete. Operating ChatGPT also consumes considerable resources: according to the researchers, a conversation of 25 to 50 individual questions requires around 500 milliliters of water.
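The per-question figures implied by this range also explain the headline number; the values below are derived from the article's figures:

```python
# Back-of-envelope: water per individual question (figures from the article).
water_ml = 500                        # ~500 ml per conversation
questions_low, questions_high = 25, 50

per_question_high = water_ml / questions_low   # short conversation: 20.0 ml per question
per_question_low = water_ml / questions_high   # long conversation: 10.0 ml per question
print(per_question_low, per_question_high)     # 10.0 20.0

# At the upper rate of 20 ml per question, 50 questions imply about 1 liter,
# which matches the "up to 1 liter for 50 questions" headline.
print(questions_high * per_question_high)      # 1000.0 ml
```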

The researchers also assume that training GPT-4, the latest model behind ChatGPT, required even more water. Global warming is taking its toll on the data centers as well, further increasing their need for water.




Data centers in Texas: Bard consumes more

And Google’s chatbot Bard and the model behind it, LaMDA, could have used significantly more water, since some of Google’s data centers are located in hot regions such as Texas. Here the researchers assume several million liters of water.

Incidentally, the researchers did not count water that is returned to rivers or lakes after the cooling process. Their calculations cover only the drinking water that evaporates in cooling towers.




Researchers: Don’t ignore AI water consumption

“The water footprint of AI models can no longer be ignored,” the researchers demand. “Water use must be addressed as a priority as part of the collective effort to address global water challenges.”


As a first step, the tech companies should disclose the water consumption of their AI systems. In addition, the data centers used for AI training should preferably be operated in regions where it is not so hot.
