With data center costs rising and the ongoing need for more space, power, and hardware to deploy these enormous AI factories, SpaceX CEO Elon Musk said in 2025 that AI computers in outer space would be the lowest-cost option within five years, signaling a paradigm shift for future tech. Musk’s prediction likely reflects SpaceX’s interests, but Jensen Huang, CEO of Nvidia, is far more cautious about that timeline. In the company’s latest earnings report, Huang said that today’s poor economics are a problem for space-based data centers, “but it’s going to improve over time.” As Huang explains, that’s because space itself is a lot different from life “down here.”
These aren’t the only reasons Huang isn’t bullish on extraterrestrial data centers: the cost of leaving Earth is still exorbitant, replacing a defective part of an orbiting AI data center would be much slower than on the ground, and creating a proper cooling method might be more challenging than expected. That said, the Nvidia executive isn’t saying it’s impossible to move AI data centers to space, only that we might be a lot farther from that moment than Elon Musk thinks.
Cooling AI data centers in space is more challenging than you might think
Even though outer space is cold, the more important factor at play is that space is essentially a vacuum, which is terrible for dissipating heat. Once an extraterrestrial AI data center started generating heat, it couldn’t rely on convection to carry that heat away; in other words, the most common cooling methods used on Earth wouldn’t work in space because there’s no air. That leaves thermal radiation as the only practical way to shed waste heat.
Currently, the best way to dissipate heat in space is with a thermal radiator like the Active Thermal Control System NASA uses aboard the International Space Station. While the technology exists and works well, scaling it up for a data center in space would present challenges, as huge radiators would add even more weight to the structure that needs to be sent to space.
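To see why those radiators get huge, here is a back-of-envelope sketch using the Stefan-Boltzmann law, which governs how much heat a surface can radiate into a vacuum. The numbers are illustrative assumptions, not engineering values: a radiator running at 300 K, emissivity of 0.9, radiating from one side only, and absorbing no sunlight.

```python
# Back-of-envelope radiator sizing using the Stefan-Boltzmann law.
# All parameters are illustrative assumptions, not a real design.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

def radiator_area_m2(heat_watts: float, temp_k: float = 300.0,
                     emissivity: float = 0.9) -> float:
    """Radiator area needed to reject `heat_watts` into a vacuum."""
    flux = emissivity * SIGMA * temp_k ** 4  # watts radiated per m^2
    return heat_watts / flux

# Under these assumptions, a modest 1 MW cluster already needs
# on the order of 2,400 square meters of radiator surface.
print(f"{radiator_area_m2(1e6):,.0f} m^2")
```

Because radiated power scales with the fourth power of temperature, running the radiators hotter shrinks them dramatically, but that in turn constrains how the chips themselves can be cooled.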
Mind you, none of this is impossible, as private companies in the U.S. and state-backed efforts in China are already working on sending data centers into orbit. Rather, Nvidia’s Jensen Huang is being pragmatic and saying that higher costs aren’t worth it right now.
Leaving Earth is expensive, and cost is a moving target
As a company, SpaceX has been at the forefront of improving the cost-effectiveness of space travel. After all, with its rocket boosters able to land safely back on Earth and fly again, much of the invested money isn’t simply burned away. Still, if it’s a choice between persuading a local government to grant tax and environmental exemptions or shipping an entire state-of-the-art facility into space, most companies will choose the first option.
Besides that, the cost of sending things into space is a moving target. As Senior Space Industry Analyst Alexandre Najjar noted on X, SpaceX recently raised Falcon 9 prices from $70 million to $74 million per launch, and from $6,000 to $7,000 per kilogram of payload. Since there are few alternatives for getting hardware into orbit, SpaceX’s pricing changes could make it hard to develop a strategy for deploying an extraterrestrial data center while planning months and years ahead.
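A quick calculation shows what those per-kilogram figures imply at data center scale. The payload mass below is a hypothetical placeholder (a 100-tonne segment), not a real design; only the per-kilogram prices come from the figures above.

```python
# Rough launch-cost arithmetic using the reported Falcon 9 figures.
# The payload mass is a hypothetical assumption for illustration.

OLD_PRICE_PER_KG = 6_000  # USD per kg, before the reported increase
NEW_PRICE_PER_KG = 7_000  # USD per kg, after

payload_kg = 100_000  # assumed 100-tonne data center segment

old_cost = payload_kg * OLD_PRICE_PER_KG  # $600,000,000
new_cost = payload_kg * NEW_PRICE_PER_KG  # $700,000,000
print(f"old: ${old_cost:,}  new: ${new_cost:,}  "
      f"increase: ${new_cost - old_cost:,}")
```

Even that single $1,000-per-kilogram bump adds $100 million to the hypothetical segment, which is why price swings from a near-sole supplier make multi-year planning difficult.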
Even after getting all of the materials into space and safely assembled, a data center would be exposed to other uncertainties like cosmic rays and solar storm radiation. Or, as Jensen Huang put it, “the way that space works is radically different than how it works down here.”
There’s a lot of energy available, but it comes with a catch
In the right orbit, with no clouds or atmosphere in the way and near-constant sunlight, AI data centers could run almost entirely on solar energy, just like the solar-powered gadgets you can use around the home to cut down on energy costs. However, this abundant energy source wouldn’t make the job of dissipating heat any simpler.
After all, essentially all of the energy an AI data center consumes is ultimately converted into heat that, as explained earlier, would need to be dissipated by thermal radiators. In other words, a space data center couldn’t simply scale up energy consumption, because every extra watt would also create more heat that has to be radiated away. So even if energy is plentiful in space, the limiting factor isn’t power supply; it’s heat dissipation.
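The two constraints compound: every extra watt needs radiator area, radiator area has mass, and mass costs money to launch. The sketch below chains those steps together; every parameter is an illustrative assumption (the radiator temperature, emissivity, and areal density are hypothetical, and only the $7,000-per-kilogram price comes from the figures above).

```python
# Sketch of why "just add more power" isn't free in orbit: extra watts
# require extra radiator area, which requires extra launch mass.
# All parameters below are illustrative assumptions.

SIGMA = 5.670e-8       # Stefan-Boltzmann constant, W / (m^2 * K^4)
TEMP_K = 300.0         # assumed radiator temperature
EMISSIVITY = 0.9       # assumed radiator emissivity
AREAL_DENSITY = 3.0    # assumed radiator mass, kg per m^2
PRICE_PER_KG = 7_000   # USD per kg, per the Falcon 9 figure above

def radiator_launch_cost(extra_watts: float) -> float:
    """Extra launch cost for the radiators that extra power demands."""
    flux = EMISSIVITY * SIGMA * TEMP_K ** 4      # ~413 W per m^2
    area = extra_watts / flux                    # extra radiator area, m^2
    return area * AREAL_DENSITY * PRICE_PER_KG   # extra launch cost, USD

# Under these assumptions, each additional megawatt of compute drags
# tens of millions of dollars of radiator mass up the gravity well.
print(f"${radiator_launch_cost(1e6):,.0f} per extra MW")
```

The exact figure swings with every assumption, but the shape of the problem doesn’t: power, cooling, and launch mass are coupled, which is the core of Huang’s skepticism about today’s economics.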
With that in mind, a future where AI data centers are common in space might be a few breakthrough discoveries away. Or, as Nvidia’s Jensen Huang put it, AI’s future success is likely going to require some pain and suffering.
