OpenAI will not disclose GPT-5’s energy use. It could be higher than past models

News Room | Published 9 August 2025 | Last updated 3:13 PM

In mid-2023, if a user asked OpenAI’s ChatGPT for a recipe for artichoke pasta or instructions on how to make a ritual offering to the ancient Canaanite deity Moloch, its response might have taken – very roughly – 2 watt-hours, or about as much electricity as an incandescent bulb consumes in 2 minutes.

OpenAI released a model on Thursday that will underpin the popular chatbot – GPT-5. Ask that version of the AI for an artichoke recipe, and the same amount of pasta-related text could take several times – even 20 times – that amount of energy, experts say.

As it rolled out GPT-5, the company highlighted the model’s breakthrough capabilities: its ability to create websites, answer PhD-level science questions, and reason through difficult problems.

But experts who have spent recent years benchmarking the energy and resource usage of AI models say those new powers come at a cost: a response from GPT-5 may take significantly more energy than a response from previous versions of ChatGPT.

OpenAI, like most of its competitors, has released no official information on the power usage of its models since GPT-3, which came out in 2020. Sam Altman, its CEO, tossed out some numbers on ChatGPT’s resource consumption on his blog this June. However, these figures, 0.34 watt-hours and 0.000085 gallons of water per query, do not refer to a specific model and have no supporting documentation.

“A more complex model like GPT-5 consumes more power both during training and during inference. It’s also targeted at long thinking … I can safely say that it’s going to consume a lot more power than GPT-4,” said Rakesh Kumar, a professor at the University of Illinois, currently working on the energy consumption of computation and AI models.

The day GPT-5 was released, researchers at the University of Rhode Island’s AI lab found that the model can use up to 40 watt-hours of electricity to generate a medium-length response of about 1,000 tokens, which are the building blocks of text for an AI model and are approximately equivalent to words.

A dashboard they put up on Friday indicates GPT-5’s average energy consumption for a medium-length response is just over 18 watt-hours, a figure that is higher than all other models they benchmark except for OpenAI’s o3 reasoning model, released in April, and R1, made by the Chinese AI firm DeepSeek.

This is “significantly more energy than GPT-4o”, the previous model from OpenAI, said Nidhal Jegham, a researcher in the group.

Eighteen watt-hours would correspond to burning that incandescent bulb for 18 minutes. Given recent reports that ChatGPT handles 2.5bn requests a day, the total consumption of GPT-5 could reach the daily electricity demand of 1.5m US homes.
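
Those comparisons are straightforward to check with back-of-the-envelope arithmetic. The short sketch below assumes a 60 W incandescent bulb and average US household consumption of roughly 30 kWh per day; both are illustrative assumptions rather than figures given in the article.

# Back-of-the-envelope check of the article's comparisons. The 60 W bulb and
# ~30 kWh/day household figures are assumptions for illustration only.
WH_PER_RESPONSE = 18            # average GPT-5 energy per medium-length response (Wh)
REQUESTS_PER_DAY = 2.5e9        # reported daily ChatGPT requests
BULB_WATTS = 60                 # assumed incandescent bulb
HOME_KWH_PER_DAY = 30           # assumed average US household consumption

bulb_minutes = WH_PER_RESPONSE / BULB_WATTS * 60               # 18 minutes
total_gwh_per_day = WH_PER_RESPONSE * REQUESTS_PER_DAY / 1e9   # 45 GWh per day
equivalent_homes = total_gwh_per_day * 1e6 / HOME_KWH_PER_DAY  # 1.5 million homes

print(f"{bulb_minutes:.0f} bulb-minutes per response")
print(f"{total_gwh_per_day:.0f} GWh/day, roughly {equivalent_homes / 1e6:.1f}m US homes")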

As large as these numbers are, researchers in the field say they align with their broad expectations for GPT-5’s energy consumption, given that GPT-5 is believed to be several times larger than OpenAI’s previous models. OpenAI has not released the parameter counts – which determine a model’s size – for any of its models since GPT-3, which had 175bn parameters.

A disclosure this summer from the French AI company Mistral found a “strong correlation” between a model’s size and its energy consumption, based on the company’s study of its in-house systems.

“Based on the model size, the amount of resources [used by GPT-5] should be orders of magnitude higher than that for GPT-3,” said Shaolei Ren, a professor at the University of California, Riverside who studies the resource footprint of AI.

Benchmarking AI power usage

GPT-4 was widely believed to be 10 times the size of GPT-3. Jegham, Kumar, Ren and others say that GPT-5 is likely to be significantly larger than GPT-4.

Leading AI companies like OpenAI believe that extremely large models may be necessary to achieve artificial general intelligence (AGI), an AI system capable of doing humans’ jobs. Altman has argued strongly for this view, writing in February: “It appears that you can spend arbitrary amounts of money and get continuous and predictable gains,” though he said GPT-5 did not surpass human intelligence.

In its benchmarking study in July, which looked at the power consumption, water usage and carbon emissions for Mistral’s Le Chat bot, the startup found a one-to-one relationship between a model’s size and its resource consumption, writing: “A model 10 times bigger will generate impacts one order of magnitude larger than a smaller model for the same amount of generated tokens.”

Jegham, Kumar and Ren said that while GPT-5’s scale is significant, other factors will probably come into play in determining its resource consumption. GPT-5 is deployed on more efficient hardware than some previous models, and it appears to use a “mixture-of-experts” architecture, meaning that not all of its parameters are activated when responding to a query, a design that will likely cut its energy consumption.
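
To illustrate what “not all of its parameters are activated” means in practice, here is a minimal sketch of top-k expert routing, the general idea behind mixture-of-experts layers. The layer sizes, expert count and top_k value are illustrative assumptions and do not describe GPT-5’s actual configuration.

import numpy as np

# Minimal mixture-of-experts routing sketch: a gating network scores the
# experts for each token and only the top-k of them run, so most expert
# parameters stay idle. All sizes here are illustrative assumptions.
rng = np.random.default_rng(0)
d_model, n_experts, top_k = 64, 8, 2

W_gate = rng.standard_normal((d_model, n_experts))
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]

def moe_forward(x):
    scores = x @ W_gate                      # one score per expert
    chosen = np.argsort(scores)[-top_k:]     # keep only the top-k experts
    weights = np.exp(scores[chosen] - scores[chosen].max())
    weights /= weights.sum()                 # softmax over the chosen experts
    # Only top_k of n_experts weight matrices are multiplied for this token.
    return sum(w * (experts[i] @ x) for w, i in zip(weights, chosen))

token = rng.standard_normal(d_model)
output = moe_forward(token)
print(f"active experts per token: {top_k} of {n_experts}")

With only two of eight experts active per token in this sketch, a fraction of the expert parameters are exercised on each query, which is the kind of saving the researchers describe.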

On the other hand, GPT-5 is also a reasoning model and works with video and images as well as text, which likely makes its energy footprint far greater than that of text-only operation, both Ren and Kumar say – especially as reasoning mode means the model computes for longer before responding to a query.

“If you use the reasoning mode, the amount of resources you spend for getting the same answer will likely be several times higher, five to 10,” said Ren.

Hidden information

In order to calculate an AI model’s resource consumption, the group at the University of Rhode Island multiplied the average time that model takes to respond to a query – be it for a pasta recipe or an offering to Moloch – by the model’s average power draw during its operation.
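
As described, the estimate is simply average power multiplied by average time. A minimal sketch of that calculation follows; the latency and power-draw numbers are hypothetical placeholders, not the group’s measured values.

# Sketch of the estimation approach described above:
# energy per query (Wh) = average power draw (W) x average response time (h).
# The example numbers are hypothetical, not measured values from the study.
def energy_per_query_wh(latency_seconds: float, power_draw_watts: float) -> float:
    return power_draw_watts * latency_seconds / 3600

# e.g. a hypothetical 24-second response on hardware drawing about 2,700 W
print(energy_per_query_wh(24, 2700))   # 18.0 Wh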

Estimating a model’s power draw was “a lot of work”, said Abdeltawab Hendawi, a professor of data science at the University of Rhode Island. The group struggled to find information on how different models are deployed within data centers. Their final paper contains estimates for which chips are used for a given model, and how different queries are parceled out among the chips in a data center.

Altman’s June blog post confirmed their findings. The figure he gave for ChatGPT’s energy consumption, 0.34 watt-hours per query, closely matches what the group found for GPT-4o.

Hendawi, Jegham and others in their group said that their findings underscored the need for more transparency from AI companies as they release ever-larger models.

“It’s more critical than ever to address AI’s true environmental cost,” said Marwan Abdelatti, a professor at URI. “We call on OpenAI and other developers to use this moment to commit to full transparency by publicly disclosing GPT-5’s environmental impact.”
