Data centers are insatiable monsters that their operators must keep feeding. OpenAI, Meta, Microsoft, xAI, Anthropic and Google are burning money setting up colossal data centers for training and running artificial intelligence. But these installations are not only expensive to set up: they are also expensive to maintain. They need a considerable amount of energy to operate, and Google has just received a boost in renewables.
All thanks to a direct connection to the largest power grid in the United States.
Renewables to power AI. Google and TotalEnergies have just signed a 15-year power purchase agreement. Under the contract, the energy company will deliver 1.5 TWh of electricity to Google from its Montpelier solar plant in Ohio. The plant is still under construction and is expected to have a capacity of 49 MW, but the most important detail is that it will be connected directly to the PJM system.
PJM is the largest grid operator in the United States. It covers 13 states, and data centers now account for a significant share of its load: in its latest annual capacity auction, demand from these facilities boosted PJM's capacity sales by $7.3 billion, an 82% increase.
Astronomical needs. In its statement, TotalEnergies said the agreement illustrates its ability to meet the growing energy demands of major technology companies. The problem is that it is not enough. In Google's case, its data centers consumed 30.8 million megawatt hours of electricity in 2024. The company has been focused on AI for years, but the recent boom means its centers now use more than double what they consumed in 2020 (14.4 million MWh).
Data centers are currently estimated to account for 95.8% of Google's total electricity consumption. And it's not just Google: the International Energy Agency estimates that data centers worldwide consumed 415 TWh last year, roughly 1.5% of global electricity consumption.
That may seem small as a percentage, but consider that Spain consumed 231,808 GWh, or about 232 TWh, in all of 2024. Data centers, driven largely by a handful of companies, consumed nearly twice as much as an entire country. And that consumption is projected to more than double by 2030, reaching 945 TWh.
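To put those figures in perspective, here is a quick back-of-the-envelope check. It is just a sketch in Python that uses only the numbers quoted in this article, nothing else:

```python
# Sanity check of the consumption figures quoted above (all values in TWh).
global_dc_2024 = 415.0    # global data center consumption, 2024 (IEA estimate)
global_dc_2030 = 945.0    # IEA projection for 2030
spain_2024 = 231.808      # Spain's consumption in 2024 (231,808 GWh)
google_dc_2024 = 30.8     # Google data centers, 2024 (30.8 million MWh)
google_dc_2020 = 14.4     # Google data centers, 2020 (14.4 million MWh)

print(f"Global data centers vs Spain (2024): {global_dc_2024 / spain_2024:.2f}x")    # ~1.79x
print(f"Projected global growth 2024->2030:  {global_dc_2030 / global_dc_2024:.2f}x") # ~2.28x
print(f"Google data centers 2020->2024:      {google_dc_2024 / google_dc_2020:.2f}x") # ~2.14x
```

The ratios bear out the claims in the text: global data centers already use nearly twice what Spain does, and both the global figure and Google's own consumption are on track to more than double.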
Renewables are not enough. Although renewables help cover part of the energy data centers require, solar and wind have two limitations: intermittency and variability. Generation depends on weather conditions and the time of day, so output fluctuates dramatically even within a single day. That instability clashes head-on with the high reliability and availability requirements of data centers.
These facilities must operate continuously and cannot tolerate unforeseen outages or drops in supply, since AI services and cloud storage would suffer the consequences. Renewables therefore need battery backup, but deploying batteries at that scale just to power data centers is complicated and expensive.
Turning to gas and eyeing nuclear. That's where other sources come into play. On the one hand, nuclear: in October 2024, Google signed the world's first corporate agreement to purchase nuclear energy from SMR reactors. The first is expected to come online in 2030, and together they should supply the company with 500 MW of capacity by 2035.
On the other hand, natural gas: in October of this year, the Broadwing Energy Center project got underway, a new natural gas power plant that will have a capacity of 400 MW and is scheduled to come online at the end of 2029.
Decarbonization and pressure. The big question is: doesn't using gas to power AI clash with tech companies' decarbonization targets for 2030 and 2050? We've already seen oil companies step off the renewables bandwagon after realizing that fossil fuels are still relevant to the technology industry, but in Google's case, the bet is that projects such as the Broadwing Energy Center will include CCS (carbon capture and storage) systems.
That means the plant will have a carbon capture system capable of permanently sequestering 90% of its emissions. It amounts to burying the problem, literally, since the CO₂ will be stored a mile underground. In 2020, before the AI boom, Google set the goal of running on carbon-free energy 24 hours a day, seven days a week by 2030.
It will be interesting to see how the company plans to offset these emissions with renewables, but the IEA expects data center demand to keep growing in the short term, and that adds another problem: greater pressure on the electrical grid, yet another element to manage. The big underlying issue is that demand for energy is growing faster than the capacity to generate new electricity, and that ends up showing in the bills of companies, but also those of households.
Images | Unsplash, Google Data Center
In WorldOfSoftware | China does not have a spending problem with AI. What it has is a huge income gap compared to its main rival
