Companies should focus on reducing their carbon footprint and running their data centers on renewable energy.
Artificial intelligence (AI) relies heavily on computing power: the complexity of machine learning and deep learning models demands substantial computing resources, which, given the energy requirements of modern hardware, translates into extremely high power consumption.
Most AI research today focuses on achieving the highest levels of accuracy, with little attention to computational or energy efficiency. Leaderboards in the AI community track which system performs best at tasks such as image recognition or language understanding, where accuracy is prioritized above all else.
Deep learning, based on neural networks with billions of parameters, is inherently computationally intensive. The more complex the network, the greater the need for compute and the longer the training times.
Canadian researchers Victor Schmidt et al. report that state-of-the-art neural architectures are often trained on multiple GPUs for weeks or months to surpass previous performance.
The costs of AI
AI is expensive. Research by OpenAI researchers Dario Amodei and Danny Hernandez shows that since 2012, the computing power used for deep learning research has doubled every 3.4 months. This amounts to a 300,000-fold increase between 2012 and 2018, far exceeding Moore’s law, which states that processing power doubles every two years.
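The arithmetic behind these figures can be checked directly. The sketch below, using only the doubling period and total growth factor cited above, derives how many doublings a 300,000-fold increase implies and contrasts it with Moore's-law growth over the same span:

```python
import math

# Figures from the OpenAI analysis cited above: compute doubles every
# 3.4 months, for a ~300,000-fold total increase between 2012 and 2018.
DOUBLING_MONTHS = 3.4
TOTAL_GROWTH = 300_000

doublings = math.log2(TOTAL_GROWTH)      # doublings implied by the growth factor
months = doublings * DOUBLING_MONTHS     # elapsed time at one doubling per 3.4 months

# Moore's law for comparison: one doubling every 24 months over the same period.
moore_growth = 2 ** (months / 24)

print(f"{doublings:.1f} doublings over ~{months / 12:.1f} years")
print(f"Moore's-law growth over the same period: ~{moore_growth:.0f}x")
# → 18.2 doublings over ~5.2 years
# → Moore's-law growth over the same period: ~6x
```

A 3.4-month doubling period thus reproduces the 300,000-fold figure in just over five years, during which Moore's law alone would have delivered only about a six-fold gain.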
As AI use grows, especially with consumer applications like ChatGPT, energy consumption continues to escalate.
But, and this is good news, as the world focuses on climate change, AI researchers are also starting to recognize its carbon costs. A study by Roy Schwartz et al. of the Allen Institute for AI argues that efficiency should become a priority alongside accuracy. AI models require enormous amounts of computing power for data processing, training, and experimentation, which increases CO2 emissions.
Similarly, researchers at the University of Massachusetts (Strubell et al., 2019) highlighted AI’s environmental impact by analyzing the computational demands of neural architecture search for machine translation models.
The study estimated that training one such model emits 284,019 kg of CO2, equivalent to 125 round trips between New York and Beijing. As AI’s energy needs continue to grow, it is critical to consider sustainability alongside utility.
The good news
Fortunately, AI can help in our global quest to reduce greenhouse gas emissions. A 2019 study from Microsoft and PwC predicted that responsible use of AI could reduce global greenhouse gas emissions by 4% (2.4 gigatons) by 2030.
AI is already being used to optimize energy consumption in industrial and residential sectors, predict supply and demand, manage autonomous transportation and reduce carbon footprint. For example, Google has improved the energy efficiency of its data centers by 35% using machine learning technology developed by DeepMind.
AI also helps minimize waste in green energy production, predicts solar, wind and hydropower output, and optimizes water use in residential, agricultural and manufacturing areas.
In addition, algorithms have improved agricultural processes such as precision farming, ensuring crops are picked at the right time and water is used efficiently.
The ecological responsibility of AI
According to the Shift Project, the information and communications technology (ICT) sector is responsible for around 4% of global CO2 emissions, with its contribution to greenhouse gas emissions exceeding that of the aviation industry by 60%.
As more companies adopt AI to drive innovation, the demand for cloud-optimized data center facilities will increase. By 2025, data centers are expected to account for 33% of global ICT electricity consumption.
To minimize their carbon footprint, companies must ensure their data centers are equipped to efficiently handle high-density computing needs. Unfortunately, according to research published on ScienceDirect, as many as 61% of systems managed by enterprise data centers run at low efficiency.
Furthermore, it is crucial that data centers are powered by renewable energy. Housing AI in fossil fuel-powered facilities could negate energy efficiency efforts. That’s why it’s important that companies verify the green credentials of their cloud provider.
Location is another factor in ensuring sustainable AI. Cooling data centers is expensive, especially in warmer climates, and for more than 80% of hardware, latency requirements do not demand proximity to the end user.
For example, technology giants such as Google are investing in data centers in Scandinavian countries for better energy efficiency. Furthermore, in countries such as Iceland, natural cooling reduces energy consumption, with renewable geothermal and hydropower plants providing cleaner operations.
The future
The future of AI must focus on sustainability. The World Economic Forum proposes a four-step process to balance the benefits of AI with the environmental impact:
- Select the right use case: Not all AI optimizations lead to significant CO2 reductions. Organizations should prioritize processes that can be meaningfully optimized by AI, especially for sustainability use cases.
- Choose the right algorithm: The energy consumption of an AI system largely depends on the algorithm used. By selecting the most efficient algorithm, organizations can significantly reduce training time and energy consumption.
- Predict and monitor CO2 results: Good intentions alone are not enough. AI implementers should include carbon footprint estimates in cost-benefit analyses and use sustainability as a key performance indicator for AI projects.
- Offset the footprint with renewable energy: Organizations should use green energy sources to power AI models.
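The "predict and monitor CO2 results" step above can be sketched with a simple estimate of training emissions from hardware power draw, runtime, data center overhead (PUE), and grid carbon intensity. All figures in the example are hypothetical placeholders, not measurements:

```python
def training_emissions_kg(gpu_count: int,
                          gpu_power_kw: float,
                          hours: float,
                          pue: float,
                          grid_kg_co2_per_kwh: float) -> float:
    """Estimated CO2 in kg: energy drawn, scaled by data-center
    overhead (PUE) and the carbon intensity of the local grid."""
    energy_kwh = gpu_count * gpu_power_kw * hours
    return energy_kwh * pue * grid_kg_co2_per_kwh

# Hypothetical example: 8 GPUs at 0.3 kW each, two weeks of training,
# a PUE of 1.5, and a grid emitting 0.4 kg CO2 per kWh.
estimate = training_emissions_kg(8, 0.3, 24 * 14, 1.5, 0.4)
print(f"Estimated training footprint: {estimate:.0f} kg CO2")
# → Estimated training footprint: 484 kg CO2
```

An estimate like this, computed before a training run and tracked afterward, is one way to turn sustainability into the kind of measurable KPI the framework calls for; lowering the grid intensity term is exactly what the renewable-energy step targets.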
Ben Selier is vice president of Secure Power, English-speaking Africa at Schneider Electric.