The environmental impact of ChatGPT

News Room
Published 13 October 2025 (last updated 11:12 AM)

Why is ChatGPT bad for the environment?

It is often said that “AI lives on the cloud”, but that phrase makes ChatGPT look like an abstract, almost supernatural entity — similar to the Genie in a bottle — which could not be further from the truth.

As tech executive Brian Greenberg once pointed out, “the cloud is just someone else’s computer,” which brings us back down to earth where we have to face the challenges that come with physical reality — from the energy and water consumption to the carbon footprint of training and running large language models (LLMs).

Unlike the Genie, who can live forever without food or drink, generative artificial intelligence (GenAI) consumes a sizeable slice of the planet’s resources.

When talking to an AI chatbot on our phone or laptop, many of us probably do not realise that all our messages are being processed by a remote supercomputer living in a data centre, which needs a lot of energy to run and water to keep cool.

In addition to the resources needed to address billions of user queries every day, we must not forget about how much water and energy OpenAI used to give birth to what is now globally known as ChatGPT.

How much electricity does ChatGPT use?

What distinguishes GenAI from other online tools is its incredible power density, which has driven a massive expansion in data centre construction globally.

While this is just an estimate, the power requirements for data centres in North America alone nearly doubled in a single year after ChatGPT was released — from 2,688 megawatts (MW) at the end of 2022 to 5,341 MW at the end of 2023, as reported by the Massachusetts Institute of Technology (MIT).

Globally, data centre annual electricity consumption reached 460 terawatt-hours (TWh) in 2022, and this demand is expected to more than double by 2026.

Because the speed of data centre construction is outstripping the development of renewable energy infrastructure, these facilities are largely powered by fossil fuels such as coal and gas.

The growth of AI is therefore directly linked to an increase in greenhouse gas emissions, which are the main contributor to climate change.

Based on a study conducted by The Washington Post and the University of California, Business Energy UK calculated that ChatGPT may be using around 40 million kilowatt-hours (kWh) per day to serve users all around the world.

This amount of electricity is enough to power all 3.5 million households in London.

Based on these assumptions, OpenAI’s chatbot consumes more energy annually than each of the world’s 117 lowest-consuming countries.
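The arithmetic behind these comparisons can be sanity-checked in a few lines. The sketch below uses the estimated figures cited above (the 40 million kWh/day and 3.5 million households are estimates, not measurements):

```python
# Rough sanity check of the Business Energy UK estimate cited above.
# All inputs are estimates, not measurements.

chatgpt_kwh_per_day = 40_000_000   # ~40 million kWh/day
london_households = 3_500_000      # ~3.5 million households in London

kwh_per_household = chatgpt_kwh_per_day / london_households
print(f"{kwh_per_household:.1f} kWh per household per day")  # 11.4

annual_twh = chatgpt_kwh_per_day * 365 / 1e9  # kWh -> TWh
print(f"{annual_twh:.1f} TWh per year")       # 14.6
```

At roughly 11 kWh per household per day, the “all of London” comparison is at least internally consistent, and the annualised figure of about 15 TWh is what underpins the comparison with low-consumption countries.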

Why does ChatGPT use water and how much?

Besides electricity, data centres also have an insatiable thirst for freshwater.

The thousands of servers running 24/7 generate a tremendous amount of heat, and if not properly cooled, they will malfunction or even pose safety risks.

Engineering professor Dereje Agonafer estimates that cooling systems can account for as much as 40% of such a facility’s total energy demand.

Many of the most common and cost-effective cooling methods rely on evaporation, which is a process that consumes vast quantities of water and turns it into steam to dissipate heat.

This puts pressure on local water supplies, especially in dry regions such as Arizona, Oregon and Texas, where tech companies are drawn by cheap land and energy.

This practice has led to direct conflicts with local communities and agricultural businesses that were already struggling with water shortages.

For example, a Google data centre in The Dalles was found to be consuming 20% of the entire city’s drinking water supply, creating competition with local farms and orchards.

While the exact location of OpenAI’s servers is unknown, the company’s AI models are largely hosted on Microsoft’s Azure platform, which has a network of data centres across the world, including in fragile desert ecosystems in Texas and Arizona.

In terms of numbers, Business Energy UK calculated that ChatGPT needs about 150 million litres of water a day to keep going.

This equals the daily water usage of one million Britons, roughly the population of Liverpool.
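The per-person arithmetic behind the Liverpool comparison is a useful quick check (both inputs are the estimates cited above):

```python
# The Liverpool comparison cited above, checked per person.
chatgpt_litres_per_day = 150_000_000  # ~150 million litres/day (estimate)
people = 1_000_000                    # ~1 million Britons

litres_per_person = chatgpt_litres_per_day / people
print(f"{litres_per_person:.0f} litres per person per day")  # 150
# ~150 litres/day is close to typical UK per-capita water use,
# which is what makes the two figures line up.
```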

However, a regular user in the UK is unlikely to be aware that their conversation with an AI chatbot is straining supplies in a drought-stricken region on the other side of the world, which makes the issue easy to overlook.

Is ChatGPT’s hardware bad for the environment?

The operational energy and water consumption of LLMs represents only one part of their total environmental impact.

The entire lifecycle of the specialised equipment that makes GenAI possible, from the extraction of raw materials to the manufacturing of components and the eventual disposal, also carries a heavy environmental cost.

LLMs rely on Graphics Processing Units (GPUs), with industry leader NVIDIA being the primary supplier for most advanced AI systems.

Holistic AI Team estimates that running a model on the scale of ChatGPT requires thousands of these powerful chips operating in concert.

The manufacturing process begins with the mining of various minerals and rare earth elements.

The fabrication of the silicon wafers and microchips themselves also involves substantial amounts of energy and water, along with a range of toxic chemicals.

The AI industry’s main metric for progress has historically been model size, with the number of parameters treated as a proxy for how “smart” the model is.

For example, the original ChatGPT, running on GPT-3.5, had 175 billion parameters, while its successor GPT-4 is rumoured to have around 1.8 trillion, roughly ten times as many.

As tech companies release new, more powerful and larger models every few months, they also need to replace their GPUs more and more frequently, which contributes to a growing global problem of electronic waste, also known as e-waste.

Is using ChatGPT worse for the environment than its training?

Before an AI model is introduced to the public, it goes through a lot of training, where the LLM learns to process data and generate relevant output for the user.

This process is incredibly energy-intensive, according to the Canadian Institute for Advanced Research (CIFAR), as a large number of GPUs have to run for weeks or even months to train a model like ChatGPT.

In a 2021 research paper, scientists from Google and the University of California, Berkeley estimated the process to train OpenAI’s GPT-3 alone consumed 1,287 MWh of electricity, which is enough to power about 120 average homes for a year, generating about 552 tons of carbon dioxide.

In terms of water, a study conducted by the University of California, Riverside in 2023 estimated that training GPT-3 in Microsoft’s US data centres could consume a total of 5.4 million litres of water.

This all represents a massive, one-time expenditure of resources, but Holistic AI suggests that the ongoing, cumulative impact of using AI models is the dominant environmental factor.

How much energy and water does ChatGPT use for a single query?

Every time you ask ChatGPT a question, the model performs a complex calculation to generate a response.

Sam Altman, CEO of OpenAI, recently revealed on his blog that an average user query uses about 0.34 watt-hours (Wh) of electricity and around 0.39 millilitres (ml) of water — equivalent to what a high-efficiency lightbulb consumes in a couple of minutes, and roughly one-fifteenth of a teaspoon.

While the resource consumption of a single query appears minor on its own, the global usage of the platform represents a growing burden on energy and water sources.

As of August 2025, ChatGPT has 800 million weekly active users, processing more than one billion queries every day.

This suggests that the total daily energy consumption could be around 340 MWh, roughly a quarter of the 1,287 MWh it reportedly took to train GPT-3.

Kasper Groes Albin Ludvigsen, board chair at the Danish Data Science Community (DDSC), has calculated that the power used to answer users’ questions would surpass the amount used in the training stage of GPT-4 within 150-200 days.
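Scaling Altman’s per-query figure to the reported query volume reproduces the 340 MWh/day estimate, and Ludvigsen’s 150-200 day claim then implies a rough training-energy figure for GPT-4 (an inference from the numbers above, not a published value):

```python
# Scaling Altman's per-query estimate (0.34 Wh) to the reported
# volume of one billion queries per day.
wh_per_query = 0.34
queries_per_day = 1_000_000_000

daily_mwh = wh_per_query * queries_per_day / 1e6  # Wh -> MWh
print(f"{daily_mwh:.0f} MWh per day")  # 340

# Ludvigsen's 150-200 day claim then implies a GPT-4 training cost
# of roughly 51-68 GWh (an inference, not a published figure).
low_gwh = daily_mwh * 150 / 1000
high_gwh = daily_mwh * 200 / 1000
print(f"Implied GPT-4 training energy: {low_gwh:.0f}-{high_gwh:.0f} GWh")
```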

However, Altman’s figure was shared informally, without supporting documentation or a definition of what an “average query” means.

It is also unclear whether this is an average number for all ChatGPT models, or if it applies to a specific one — whether it be the original GPT-3.5 or OpenAI’s most advanced GPT-5.

A recent analysis by Epoch AI estimates that while a simple query might indeed use 0.3 Wh, a longer article involving a 10,000-token input could consume around 2.5 Wh.

For a maximum input of 100,000 tokens, roughly 200 pages of text, the energy could climb to as much as 40 Wh per query.
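Epoch AI’s estimates suggest per-query energy scales steeply with input length. A small sketch tabulating the figures cited above (the token counts and Wh values are Epoch AI’s estimates):

```python
# Epoch AI's per-query energy estimates at different input sizes,
# as cited above. Token counts and Wh values are Epoch AI's figures.
estimates = [
    ("simple query", 0.3),
    ("10,000-token input", 2.5),
    ("100,000-token input", 40.0),
]

baseline_wh = estimates[0][1]
for label, wh in estimates:
    print(f"{label}: {wh} Wh ({wh / baseline_wh:.0f}x the baseline)")
# The spread from 0.3 Wh to 40 Wh is a ~130x difference, which is
# why a single "average query" number can be misleading.
```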

It is also worth noting that generating text content is far less intensive than generating multimedia content.

A 2024 study by researchers from Hugging Face and Carnegie Mellon University showed that using an AI model to generate an image takes as much energy as fully charging your smartphone.

In terms of water, the academic view diverges even further from Altman’s claims.

Researchers at the University of California, Riverside and several other studies estimate that ChatGPT consumes around 10 ml of water per query on average — just under a tablespoon.

The number of user queries is only expected to grow as tech companies are incorporating AI models into more and more of their products, but the real usage and environmental impact of these integrations is harder to measure.

This is a classic example of the Jevons paradox, where technological advancements make a product more efficient to use, but as the cost drops, the overall demand increases, causing total resource consumption to rise.

How much is ChatGPT worse for the environment than other computer tasks?

Some may argue that talking to AI is just using a computer and browsing the internet, but the resources needed to enable this task can hardly be compared to writing emails or watching YouTube.

Noman Bashir, a Computing and Climate Impact Fellow at the MIT Climate and Sustainability Consortium, said that a GenAI training cluster might consume seven to eight times more energy than a typical computing workload.

For example, a web search using Google needs 10 times less electricity than a ChatGPT query, according to the World Economic Forum.
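Combining the WEF’s 10x claim with Altman’s 0.34 Wh figure gives a rough sense of scale, though this mixes two separate estimates and should be read loosely:

```python
# Combining the WEF's "10x less" claim with Altman's 0.34 Wh figure.
# Mixing two independent estimates, so treat the result loosely.
chatgpt_wh = 0.34
google_wh = chatgpt_wh / 10  # ~0.034 Wh, implied by the 10x claim

queries_per_day = 1_000_000_000
extra_mwh = (chatgpt_wh - google_wh) * queries_per_day / 1e6
print(f"~{extra_mwh:.0f} MWh/day more than the same volume of searches")
# ~306 MWh/day
```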

While people still use traditional engines more than ChatGPT for general search, the number of users turning to an AI model instead of Google is increasing, according to a 2024 survey by Future.

So, is ChatGPT really bad for the environment?

Based on the available information and academic research, it is clear that the AI industry as a whole is a major consumer of global energy and water resources, with a daily footprint comparable to that of small countries.

Furthermore, the prevailing industry motto appears to be “bigger is better”, which pushes AI companies to keep increasing model complexity and, with it, resource consumption.

As the world’s favourite chatbot, OpenAI’s model is positioned as a main contributor to the issue, but that conclusion rests largely on estimates.

A spokesperson for OpenAI told The Sun: “We’re focused on making our AI more capable and more efficient. Each generation of our models uses less energy to train and run, and our new Norway data centre will be powered entirely with renewable energy.

“We’re also teaming up with researchers to explore how AI can speed up progress on clean energy and climate science.”

Earlier this year, the company launched a multi-year partnership with the US National Laboratories, under which scientists use OpenAI’s models to advance scientific breakthroughs, including in energy.

As part of the firm’s NextGenAI initiative, the University of Oxford is also using OpenAI’s technology to expand research in new areas like climate change.

When looking at specific numbers, there are discrepancies between what AI companies report and what independent researchers say.

While Google has published a detailed methodology for its environmental metrics, the industry as a whole lacks standardised, verifiable reporting protocols to prevent greenwashing.

By focusing only on the direct, on-site water and electricity usage, tech companies report much less worrying numbers than academic researchers find through a more comprehensive, lifecycle approach.

This is not a problem unique to OpenAI, but a systemic issue that undermines the credibility of sustainability claims across the tech sector and highlights the need for more transparency and standardised reporting protocols.

Like any revolutionary tool, AI brings both controversy and promise — from privacy concerns and copyright disputes to breakthroughs in education, healthcare and accessibility.

Raising awareness of these dimensions, including the often-overlooked environmental impact, allows us to make more informed and responsible choices about how we use it.

After all, like the Genie and his three wishes, ChatGPT’s power is not infinite and shouldn’t be taken for granted.
