There is an important consideration missing from the conversation surrounding AI: its impact on the environment. The rapid growth of AI places significant demands on resources like water and electricity, and it is our responsibility to understand these impacts and take proactive steps toward sustainable use.

We often say that AI computing happens in "the cloud," but in fact AI runs on tangible servers housed in large-scale data centers. Similar data centers have held servers for cloud computing services like search engines and social networking platforms for decades, but the rapid growth of AI is driving expansion toward larger capacity and higher computational power.

Image of servers in a data center. "Data Center" by Sean Ellis is licensed under CC BY 2.0.

As you can imagine, building, training, and powering these AI servers requires a sizable amount of electricity. The growth of AI is sharply increasing the energy demands of data centers: a single ChatGPT query requires roughly ten times as much electricity as a Google search. To use a practical example, asking ChatGPT to write a single 100-word email requires about as much electricity as running 14 LED light bulbs for an hour.
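The LED comparison above can be sanity-checked with simple arithmetic. Assuming a typical LED bulb draws about 10 watts (my assumption, not a figure from this article), the sketch below converts the comparison into watt-hours:

```python
# Back-of-envelope check of the email-vs-LED-bulb comparison.
LED_BULB_WATTS = 10   # assumed power draw of one LED bulb (not from the article)
NUM_BULBS = 14        # bulbs in the comparison
HOURS = 1             # duration in the comparison

# Energy used by 14 LED bulbs running for one hour, in watt-hours.
email_energy_wh = LED_BULB_WATTS * NUM_BULBS * HOURS
print(f"~{email_energy_wh} Wh, i.e. {email_energy_wh / 1000} kWh per 100-word email")
# → ~140 Wh, i.e. 0.14 kWh per 100-word email
```

Under that assumption, one AI-written email works out to roughly 0.14 kWh, which is in the same ballpark as commonly cited estimates for this example.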

Just like other electronic devices, AI servers generate heat as they consume electricity and must be cooled to prevent damage or malfunction. Some data centers use ventilation systems, akin to the exhaust fan that cools a laptop computer, but most rely on liquid cooling systems.

As freshwater is used to cool the servers, it evaporates, so the use of liquid cooling systems contributes to freshwater scarcity. Google's data centers consumed an estimated 5.2 billion gallons of freshwater in 2022, a 20% increase over 2021 that has been attributed to the rise of AI. To use the same practical example as before, asking a chatbot to write a 100-word email consumes roughly a 16-oz bottle's worth of water for cooling.
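The two water figures above imply a 2021 baseline, which can be recovered from the stated 20% increase; a minimal sketch:

```python
# Implied 2021 water use, derived from the figures stated in the article.
GALLONS_2022 = 5.2e9   # reported 2022 freshwater consumption, in gallons
INCREASE = 0.20        # stated year-over-year increase

# If 2022 is 20% higher than 2021, divide by 1.2 to back out the 2021 figure.
gallons_2021 = GALLONS_2022 / (1 + INCREASE)
print(f"implied 2021 consumption: ~{gallons_2021 / 1e9:.1f} billion gallons")
# → implied 2021 consumption: ~4.3 billion gallons
```

In other words, the jump attributed to AI amounts to nearly a billion additional gallons in a single year.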

The carbon footprint of AI usage must be considered as well. It is estimated that training a single AI model can emit 626,000 pounds of carbon dioxide, which is equivalent to the emissions of 300 round-trip flights between New York and San Francisco.
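The flight equivalence above can also be checked with quick division; the conversion factor below is a standard one, not a figure from this article:

```python
# Rough consistency check of the training-emissions comparison.
TRAINING_EMISSIONS_LBS = 626_000   # estimated CO2 from training one model
NUM_FLIGHTS = 300                  # round-trip NY-SF flights in the comparison
LBS_PER_METRIC_TON = 2204.62       # standard pounds-to-metric-tons conversion

# Emissions implied per round-trip flight.
per_flight_lbs = TRAINING_EMISSIONS_LBS / NUM_FLIGHTS
print(f"~{per_flight_lbs:.0f} lbs (~{per_flight_lbs / LBS_PER_METRIC_TON:.2f} metric tons) per round trip")
# → ~2087 lbs (~0.95 metric tons) per round trip
```

The result, just under one metric ton of CO2 per passenger round trip, is broadly in line with commonly cited estimates for a cross-country flight, so the comparison holds together.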

Avenues to improve the environmental sustainability of AI computing are being explored. One option is heat recovery, in which the heat generated by servers is redistributed to nearby community buildings. Another option is zero-water cooling, which explores approaches to reduce water loss in liquid cooling systems. In some instances, water can be continuously recycled in a closed system, thus preventing evaporation and reducing water consumption. In other instances, non-water coolants can be used in the cooling system.

However, the best way to reduce the environmental impacts of AI may be through conservative use. For some purposes, like leveraging AI to improve healthcare or mitigate natural disasters, the benefit of using AI outweighs the cost in environmental resources. Much AI usage, though, serves nonessential purposes, like forming relationships with chatbots or generating photos of oneself in different time periods. These applications, often solely for entertainment, may not warrant the electricity and water they demand.

The growth of AI is transforming the technology industry and driving innovation, but we cannot ignore its environmental impacts. AI companies must acknowledge their own environmental footprint, but it is also up to the users of these tools to ensure they are used responsibly and sustainably. And now that you're informed, you might think twice before asking ChatGPT to write that 100-word email for you.

 

Peer Editor: Tiffany Peters

