While large language models (LLMs) like OpenAI’s ChatGPT and Google’s Bard have revolutionized the tech landscape, new research highlights their enormous water footprint.
The training process for AI models like GPT-3 and GPT-4 requires immense amounts of energy and cooling, resulting in considerable water consumption.
Researchers from the University of California, Riverside and the University of Texas at Arlington published a pre-print paper titled “Making AI Less ‘Thirsty.’”
They estimated that GPT-3 consumed 185,000 gallons (700,000 liters) of water during its training.
The water consumption would be even higher for newer models like GPT-4, which have more parameters and are trained on larger datasets.
According to the study, an average user’s conversation with ChatGPT is the equivalent of pouring out a 500-milliliter bottle of fresh water for every 20 to 50 questions and answers.
As these AI models become more popular, their water consumption could have a significant impact on water supplies, especially amid the increasing environmental challenges in the US.
Data centers use massive amounts of water to cool down server rooms and maintain an ideal temperature for the equipment.
Cooling towers, the most common cooling solution for warehouse-scale data centers, consume a significant amount of water.
The researchers estimate that roughly one gallon of water is consumed for every kilowatt-hour of energy an average data center expends.
Data centers typically rely on clean freshwater sources to avoid corrosion and bacterial growth and to control humidity.
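As a rough sanity check on the one-gallon-per-kilowatt-hour estimate above, here is a minimal sketch that converts an energy budget into an estimated cooling-water volume. The constant and the energy figure are illustrative assumptions, not measured values.

```python
# Back-of-the-envelope estimate of a data center's on-site cooling water use,
# assuming the researchers' rough figure of ~1 gallon consumed per kWh.
# All numbers here are illustrative assumptions, not measurements.

GALLONS_PER_KWH = 1.0          # approximate water usage effectiveness (assumed)
LITERS_PER_GALLON = 3.785

def cooling_water_gallons(energy_kwh: float,
                          gallons_per_kwh: float = GALLONS_PER_KWH) -> float:
    """Estimate gallons of water consumed to cool a given energy load."""
    return energy_kwh * gallons_per_kwh

# Example: a hypothetical workload drawing 1,000 kWh
energy_kwh = 1_000
gallons = cooling_water_gallons(energy_kwh)
print(f"{energy_kwh} kWh -> ~{gallons:,.0f} gal (~{gallons * LITERS_PER_GALLON:,.0f} L)")
```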
Researchers suggest several ways to reduce AI’s water footprint, including adjusting when and where AI models are trained.
Training models during cooler hours or in data centers with better water efficiency can help reduce water consumption.
Chatbot users can also engage with AI models during “water-efficient hours,” much like running appliances during off-peak hours.
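As a toy illustration of that scheduling idea, the sketch below picks the hour of day with the lowest estimated water usage effectiveness (WUE). The hourly values are hypothetical placeholders; real figures would have to come from the data center operator.

```python
# Minimal sketch of choosing a "water-efficient hour" to schedule a workload,
# assuming hourly water-usage-effectiveness (WUE) estimates are available.
# The wue_by_hour values below are made up for illustration.

wue_by_hour = {0: 0.7, 6: 0.9, 12: 1.4, 18: 1.1}  # gallons per kWh (hypothetical)

def most_water_efficient_hour(wue: dict[int, float]) -> int:
    """Return the hour with the lowest estimated water usage effectiveness."""
    return min(wue, key=wue.get)

best = most_water_efficient_hour(wue_by_hour)
print(f"Schedule the workload around hour {best}:00 (WUE ~{wue_by_hour[best]} gal/kWh)")
```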
Federated learning strategies, in which many participants collaboratively train a shared model on their own local devices, could also help decrease on-site water consumption.
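For readers unfamiliar with the technique, here is a minimal, self-contained sketch of federated averaging, the canonical federated learning algorithm: each client updates the model locally and only the weights travel to a central server, so no single data center has to host (and cool) all of the training. The local update is a toy stand-in for real training.

```python
import numpy as np

def local_update(weights: np.ndarray, data: np.ndarray, lr: float = 0.1) -> np.ndarray:
    """One step of a trivial local 'training' rule (a stand-in for real SGD)."""
    gradient = weights - data.mean(axis=0)   # toy gradient toward the local data mean
    return weights - lr * gradient

def federated_round(global_weights: np.ndarray, client_datasets: list) -> np.ndarray:
    """Each client updates the model locally; the server averages the results."""
    client_weights = [local_update(global_weights.copy(), d) for d in client_datasets]
    return np.mean(client_weights, axis=0)

rng = np.random.default_rng(0)
clients = [rng.normal(loc=i, size=(20, 3)) for i in range(4)]  # four local datasets
weights = np.zeros(3)
for _ in range(10):
    weights = federated_round(weights, clients)
print("Aggregated weights after 10 rounds:", weights)
```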
Integrating information from electricity providers and advancements in energy storage could further aid in distributing the training load based on clean energy availability.
The researchers call for greater transparency from AI model developers and data center operators in disclosing when and where AI models are trained.
Such information would be valuable for both the research community and the general public.
Knowing how and where a model was trained would also make it easier to measure, and ultimately address, concerns about AI’s water footprint.
As AI continues to advance, it is crucial for the tech industry to develop environmentally sustainable practices to minimize the water footprint of these revolutionary models.