The energy and water consumption of artificial intelligence (AI) has become a growing concern in the tech industry, particularly for large-scale machine learning models and data centers. Sustainable AI focuses on making AI technology more environmentally friendly and socially responsible.
Zorina Alliata and Hara Gavriliadi gave a talk about sustainable AI at OOP Conference.
How much energy and water an AI system consumes depends on the particular system, its size, and how it is deployed, as Gavriliadi explained:
Estimates from Gartner suggest that AI and data centers account for 2-3% of global electricity use, a share that could rise dramatically in the coming years.
AI’s water usage for cooling is significant, with a single AI conversation potentially using up to 500 ml of water.
This rapid growth in resource consumption highlights the need for more sustainable AI practices, energy-efficient technologies, and improved resource management in the tech sector, Gavriliadi mentioned.
Alliata noted that as models grow more complex, they require more computational power and energy to train and operate. Data center expansion and cooling requirements contribute significantly to overall energy usage and water consumption, she said. As AI tools and applications become more integrated into everyday online experiences and business operations, the cumulative energy demand increases substantially, she added.
Sustainable AI focuses on the long-term effects of AI, including environmental and societal effects, Gavriliadi said. She mentioned that techniques such as sparse modeling, hardware optimization, and responsible AI practices are crucial in achieving this balance between technological advancement and environmental stewardship.
There are cutting-edge methods for lowering AI’s energy footprint, Alliata explained:
The development of more energy-efficient chips and cooling systems is essential for hardware optimization, as it lowers the power consumption of the actual AI infrastructure components.
We now study quantum and neuromorphic architectures, photonic systems, and high-performance computing clusters of servers that process information differently, in order to increase compute performance significantly.
Simplifying computational procedures and advancing algorithms, for example by developing more efficient training and inference methods, can lower energy consumption, Alliata said. She mentioned algorithm enhancements such as transfer learning, which uses pre-trained models to reduce training time and, consequently, overall energy demand, and model distillation, which shrinks AI models without significantly reducing performance. Modularity is also crucial, she said; interchangeable parts make upgrades and repairs simple, prolonging hardware life and cutting down on waste.
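The distillation idea mentioned above can be sketched in a few lines: a small "student" model is trained to mimic the softened output distribution of a large "teacher" model, so the cheap model captures most of the expensive model's behavior. The sketch below is illustrative, not any speaker's implementation; the logits and temperature value are made-up assumptions.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Soften logits with a temperature; a higher temperature spreads
    # probability mass and exposes more of the teacher's "dark knowledge".
    z = logits / temperature
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence between the softened teacher and student distributions;
    # minimizing this trains the small student to mimic the large teacher.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return float(np.sum(p * np.log(p / q)))

teacher = np.array([4.0, 1.0, 0.2])  # hypothetical large-model logits
student = np.array([3.5, 1.2, 0.1])  # hypothetical small-model logits
loss = distillation_loss(teacher, student)
```

In a real training loop this loss (often combined with the usual label loss) is backpropagated through the student only; the teacher stays frozen, so the energy cost of the large model is paid once rather than at every deployment.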
Alliata mentioned that creating biodegradable AI with organic electronics and environmentally friendly packaging reduces waste, and incorporating green energy by using renewable sources to power AI infrastructure is crucial for sustainability in general.
Gavriliadi said there are several tools for estimating the environmental impact of AI solutions. These include carbon calculators that estimate emissions based on energy consumption and location-specific grid data, energy profilers that monitor and analyze energy consumption patterns during AI model execution, and offset estimators that calculate the number of trees needed to offset AI-related carbon emissions:
AWS offers a way to measure the carbon footprint of your workload and to optimize workloads on its platform, which can lower the carbon footprint by up to 99%.
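The carbon-calculator and offset-estimator tools described above boil down to simple arithmetic: multiply energy use by a location-specific grid intensity, then divide the result by how much carbon a tree absorbs. The sketch below illustrates that arithmetic only; the grid intensities and the per-tree absorption figure are assumed, illustrative numbers, not authoritative data.

```python
import math

# Assumed kg of CO2e emitted per kWh, by grid region (illustrative values).
GRID_INTENSITY_KG_PER_KWH = {
    "us-east": 0.40,
    "eu-west": 0.25,
}

# Commonly cited rough figure for annual CO2 absorption per tree (assumed).
KG_CO2_PER_TREE_PER_YEAR = 21.0

def emissions_kg(energy_kwh: float, region: str) -> float:
    # Emissions scale with energy consumed and the local grid's carbon intensity,
    # which is why the same workload emits less in a greener region.
    return energy_kwh * GRID_INTENSITY_KG_PER_KWH[region]

def trees_to_offset(emissions: float) -> int:
    # Number of trees needed to absorb the emissions over one year.
    return math.ceil(emissions / KG_CO2_PER_TREE_PER_YEAR)

e = emissions_kg(1000.0, "us-east")  # 1 MWh in a 0.40 kg/kWh region
trees = trees_to_offset(e)
```

The same structure explains why moving a workload to a lower-intensity region or a renewables-powered data center reduces its footprint even when the energy consumed is unchanged.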
Sustainable IT requires a cultural and mindset shift, Gavriliadi said. She advised developing a strategy for AI applications across business, technology, and sustainability; setting, monitoring, and assessing progress with metrics such as carbon intensity and power usage effectiveness (PUE); and training and educating employees on sustainable IT practices, she concluded.
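The two metrics Gavriliadi mentions are simple ratios. PUE divides a facility's total energy draw by the energy that reaches the IT equipment itself, so 1.0 is ideal and anything above it is cooling, lighting, and power-conversion overhead; carbon intensity normalizes emissions per unit of useful work. A minimal sketch, with made-up energy figures:

```python
def power_usage_effectiveness(total_facility_kwh: float,
                              it_equipment_kwh: float) -> float:
    # PUE = total facility energy / IT equipment energy.
    # 1.0 is ideal; the excess above 1.0 is non-compute overhead
    # such as cooling, lighting, and power distribution losses.
    return total_facility_kwh / it_equipment_kwh

def carbon_intensity(total_emissions_kg: float, output_units: float) -> float:
    # Emissions per unit of useful work (e.g. per request served or per
    # model trained); the unit of "work" here is an assumption for the sketch.
    return total_emissions_kg / output_units

pue = power_usage_effectiveness(1200.0, 1000.0)  # 200 kWh of overhead -> 1.2
```

Tracking these ratios over time, rather than raw energy totals, lets a team see whether efficiency work is paying off even as the business and its workloads grow.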
InfoQ interviewed Zorina Alliata and Hara Gavriliadi about sustainable AI.
InfoQ: How much energy and water does artificial intelligence consume?
Zorina Alliata: AI training and inference call for significant computational capability with high energy consumption. A 2019 study, for instance, calculated that training a single AI model can emit as much carbon as five cars over their lifetimes. In another study, the International Energy Agency estimated that AI training consumed as much energy as a small country.
The development of the OPT-175B model resulted in an estimated 75 tCO2e, which doubled to 150 tCO2e when including baselines and downtime.
According to the paper Carbon Emissions and Large Neural Network Training, training GPT-3 consumed an estimated 1,287 MWh of energy, corresponding to 552 tCO2e. This is equivalent to the electricity consumed by 121 U.S. households in an entire year.
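The household equivalence quoted above follows from simple division, assuming an average U.S. household consumes roughly 10.6 MWh of electricity per year (an assumed round figure, not taken from the talk):

```python
# Illustrative back-of-the-envelope check of the household comparison.
gpt3_training_mwh = 1287                # estimate quoted above
avg_household_mwh_per_year = 10.6       # assumed average U.S. annual consumption
households = gpt3_training_mwh / avg_household_mwh_per_year
```

Dividing the two gives roughly 121 households, matching the figure in the estimate.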
Hara Gavriliadi: Data centers also require a lot of water for cooling systems. According to the 2023 Amazon sustainability report, AWS data centers use 0.18 liters of water per kilowatt-hour. In a blog post about its commitment to climate-conscious data center cooling, Google said that 4.3 billion gallons of water were used worldwide in its data centers in 2021. This number reflects Google's entire operations, not only artificial intelligence, but it gives an idea of the magnitude of the issue.
InfoQ: What’s your advice to companies that want to work toward sustainable IT?
Gavriliadi: Companies that want to work toward sustainable IT should first align their IT purchasing with their sustainability goals. They should then focus on measuring, predicting, and reducing carbon emissions associated with their IT infrastructure and cloud workloads. Companies should also implement environmental best practices for cloud computing.
At AWS, using the “Sustainability Pillar” of the Well-Architected Framework can help guide these efforts. Companies can also benefit from using digital tools and data analytics to analyze and optimize their energy consumption.