What’s an AI company without GPUs? Not much more than an ambitious investor pitch.
OpenAI CEO Sam Altman has admitted twice in the past month that the company underestimated the computing power needed for two major launches: GPT-4.5 and ChatGPT’s image generator.
When GPT-4.5 came out in late February, OpenAI hailed it as its best model yet (as AI companies tend to do), with better emotional intelligence and “vibes.” But the company restricted availability to its $200/month Pro tier, citing GPU shortages.
“It is a giant, expensive model,” Altman said at the time. “We really wanted to launch it to Plus and Pro at the same time, but we’ve been growing a lot and are out of GPUs. This isn’t how we want to operate, but it’s hard to perfectly predict growth surges that lead to GPU shortages.”
This week, the same rigmarole happened with the new image generator. The company shouted the release from the rooftops and created hype on social media, largely through Studio Ghibli-inspired images that raised alarms about copyright infringement. Following high demand, Altman announced a limit on how many images users could create, again citing overworked GPUs.
“It’s super fun seeing people love images in ChatGPT, but our GPUs are melting,” he said. “We are going to temporarily introduce some rate limits while we work on making it more efficient. Hopefully won’t be long!”
These GPU shortfalls could slow mass market AI adoption. In a September 2024 blog post, Altman wrote, “If we don’t build enough infrastructure, AI will be a very limited resource that wars get fought over and that becomes mostly a tool for rich people.”
A few months later, President Trump tapped Altman to be one of the faces of a $500 billion AI infrastructure investment, dubbed Stargate, which “will be building the physical and virtual infrastructure to power the next generation of advancements in AI,” Trump said in January.
Meanwhile, China’s DeepSeek model challenges the premise that OpenAI needs more money and infrastructure. Though its claims can’t be verified, DeepSeek may be able to perform at the same level as OpenAI’s models with a fraction of the computing power. (OpenAI claims, somewhat ironically, that DeepSeek used OpenAI models for training.)
OpenAI mostly relies on chips from Nvidia, including its upcoming Blackwell chip, which promises huge performance gains and 25% less energy consumption. The company plans to expand to AMD’s offerings in 2026 and is developing its own chip as well, The Verge reports.