Generative artificial intelligence (AI) tools like OpenAI’s ChatGPT and Google’s Bard are gaining popularity, but they come at an eye-watering (pun intended) environmental cost. Microsoft, for instance, saw a steep 34 per cent surge in water consumption from 2021 to 2022, amounting to almost 1.7 billion gallons, an increase closely tied to the company’s AI research efforts. Generative AI is notably water-intensive, requiring approximately half a litre of water for every five to 50 prompts. According to Google’s 2023 Environmental Report, the company consumed a staggering 5.6 billion gallons of water last year, a 20 per cent increase over 2021 that is largely tied to its expanding AI initiatives.
AI models also demand substantial amounts of electricity. As the general public embraces generative AI tools, concerns about their environmental impact are mounting.
How much water does generative AI consume?
Although data on AI and sustainability are still limited, a recent study by researchers from the University of California, Riverside, and the University of Texas at Arlington, both in the US, sheds light on the water footprint of AI models like OpenAI’s GPT-3 and GPT-4.
For example, during GPT-3 training in its data centres, Microsoft was estimated to have utilised around 700,000 litres of fresh water. This amount is equivalent to the water needed to fill a nuclear reactor’s cooling tower or produce 370 BMW cars or 320 Tesla vehicles, according to the study.
Based on these findings, ChatGPT is estimated to require 500ml of water for every five to 50 questions answered.
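As a back-of-the-envelope check, the reported figures translate into a per-question range. The sketch below uses only the numbers quoted above (roughly 500ml per conversation of five to 50 questions); it is illustrative, not a measurement:

```python
# Rough per-question water estimate from the figures reported above:
# ~500 ml of water per conversation of five to 50 questions.

BOTTLE_ML = 500                        # water per conversation, in millilitres
QUESTIONS_MIN, QUESTIONS_MAX = 5, 50   # questions per conversation (reported range)

per_question_max_ml = BOTTLE_ML / QUESTIONS_MIN  # short conversations: 100 ml/question
per_question_min_ml = BOTTLE_ML / QUESTIONS_MAX  # long conversations: 10 ml/question

print(f"~{per_question_min_ml:.0f}-{per_question_max_ml:.0f} ml of water per question")
```

In other words, a single answered question plausibly costs somewhere between a sip and half a glass of water.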
Microsoft’s data centre water usage increased by 34 per cent from 2021 to 2022, consuming over 1.7 billion gallons, equivalent to filling more than 2,500 Olympic-sized swimming pools. Google also reported a 20 per cent increase in its water consumption during the same period.
The surge in water consumption is mainly attributed to the rising popularity of AI models. Shaolei Ren, a researcher at the University of California, Riverside, explained that a significant portion of the increase is due to growing AI workloads, and emphasised the need for awareness of the resource usage underlying these models.
In a forthcoming paper, Ren estimated that OpenAI’s GPT-3 model, which powers ChatGPT, consumed over 185,000 gallons (roughly 700,000 litres) of water during its training. Each conversation involving 25 to 50 questions in ChatGPT likely consumed the equivalent of a 500ml bottle of water. These estimates include indirect water usage by the power plants supplying energy to the data centres.
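For reference, the litre and gallon figures quoted for GPT-3’s training can be cross-checked with a one-line unit conversion (1 US gallon is about 3.785 litres):

```python
# Cross-check: convert the ~700,000-litre training estimate into US gallons.
LITRES_PER_US_GALLON = 3.785

training_water_litres = 700_000
training_water_gallons = training_water_litres / LITRES_PER_US_GALLON

print(f"{training_water_gallons:,.0f} US gallons")  # about 185,000 gallons
```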
While Google maintained steady water usage at its Oregon facility, its Council Bluffs data centres in Iowa consumed more potable water than any other data centre. Factors such as location, season, and cooling technology influence the amount of water used by data centres.
Why does generative AI need so much water?
AI models like GPT-3 and GPT-4 reside in data centres, large physical warehouses housing computational servers. These servers analyse patterns and connections across massive datasets, consuming significant electricity in the process, whether that power is generated from coal, nuclear or natural gas plants.
Training incurs substantial energy costs, and much of that energy is ultimately dissipated as heat.
On-site water is used to regulate temperatures across the infrastructure. Fresh water is vital for proper humidity control and to prevent issues caused by saltwater, such as corrosion and bacterial growth.
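One common way to reason about this cooling demand is Water Usage Effectiveness (WUE), a data-centre metric expressed in litres of water per kilowatt-hour of IT energy. The sketch below is purely illustrative: both the WUE value and the energy figure are assumptions chosen for the example, not numbers from the studies cited here:

```python
# Illustrative estimate of a data centre's on-site cooling-water draw
# using Water Usage Effectiveness (WUE, litres per kWh of IT energy).
# Both inputs below are hypothetical, chosen for illustration only.

def cooling_water_litres(it_energy_kwh: float, wue_l_per_kwh: float) -> float:
    """On-site water consumed for a given IT energy use and facility WUE."""
    return it_energy_kwh * wue_l_per_kwh

# A hypothetical training job drawing 50,000 kWh in a facility with a
# WUE of 1.8 L/kWh (an often-cited industry average).
print(cooling_water_litres(50_000, 1.8))  # 90000.0 litres
```

The takeaway is that water use scales directly with energy use, which is why the efficiency measures below target both.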
However, the lack of transparency regarding water consumption numbers related to AI training makes it challenging to determine the actual footprint. Despite concerns about the carbon footprint of generative AI, addressing the water footprint is equally crucial for achieving truly sustainable AI.
How can we reduce generative AI’s ‘thirst’ for water?
Possible solutions to this issue include developing more efficient algorithms and hardware to reduce the energy requirements of AI models.
Some possible strategies to minimise AI’s water footprint:
Utilise Renewable Energy Sources: Leveraging wind or solar power to generate electricity significantly diminishes the amount of water required in AI operations.
Implement Water-Efficient Cooling Systems: Data centres, where AI systems operate, often use water-based cooling methods. Adopting water-efficient cooling technologies like air cooling or direct-to-chip liquid cooling can effectively decrease water usage.
Develop Water-Efficient Algorithms: Designing AI algorithms that need less computational power, or optimising training processes to rely on less water-intensive infrastructure, can cut water use at the source.
Prolong Hardware Lifespan: Extending the life expectancy of hardware can mitigate water usage in its production. Designing durable and upgradeable hardware helps reduce the frequency of replacements.
Advocate for Responsible Water Management: Encouraging responsible water management practices within data centres and AI enterprises can contribute to water footprint reduction. This involves wastewater recycling, rainwater harvesting systems, and water-efficient landscaping practices.
Enforce Policies for Water Footprint Reduction: The adoption of policies and regulations that incentivise or mandate the reduction of AI’s water footprint by establishing standards, targets, or taxes is essential. This regulatory approach can drive significant positive change in AI water usage.
Microsoft has explored innovative ideas, such as locating data centres in the ocean and utilising ocean water for cooling, aiming to mitigate the environmental impact of AI models.
The water consumption of generative AI models is indeed a significant concern alongside their energy consumption. Increased awareness, research into more efficient technologies, and a commitment to sustainability are key to mitigating the environmental impact of AI models.