Can Space Solve AI’s Energy Crisis? The Physics Say No
Tech giants are betting billions on orbital data centers to escape terrestrial power constraints, but fundamental engineering barriers and environmental trade-offs suggest they're just moving the problem
Data centers will account for nearly half of U.S. electricity demand growth between now and 2030, driving tech executives to an improbable solution: launching compute into orbit. SpaceX has filed plans with the FCC to operate a constellation of up to one million “orbital data center” satellites with unprecedented computing capacity, while Google’s Project Suncatcher envisions compact constellations of solar-powered satellites carrying TPUs, with prototype satellites slated to launch by early 2027. On February 3rd, 2026, Starcloud submitted a proposal to the FCC for a constellation of up to 88,000 orbital data center satellites.
Global electricity consumption by data centers is projected to roughly double to around 945 TWh by 2030 in the base case, representing just under 3% of total global electricity consumption. The promise is seductive: in the right orbit, a solar panel can be up to eight times more productive than on Earth and produce power nearly continuously. But the reality confronting engineers is far harsher.
The Heat Problem Space Won’t Solve
“It’s counterintuitive, but it’s hard to actually cool things in space because there’s no medium to transmit hot to cold,” Voyager CEO Dylan Taylor told CNBC. The physics are unforgiving. Vacuum behaves less like a freezer and more like a thermos—if something in space generates heat and can’t efficiently radiate it away, it simply keeps getting hotter.
In space, you can’t convect or conduct heat away: you can only radiate it, and radiator area scales brutally for multi-megawatt loads. Dumping 1 MW of waste heat at 350 K requires roughly 1,200 m² of radiator area. To shed that heat, orbital platforms need large radiators facing the vacuum of space, adding significant mass that has to be launched on rockets. For AI workloads generating hundreds of megawatts, radiators become the dominant mass constraint.
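The ~1,200 m² figure follows directly from the Stefan-Boltzmann law. A minimal sketch, assuming a one-sided radiator with an emissivity of 0.9 (a typical value for radiator coatings; the article’s number corresponds to a near-ideal emissivity):

```python
# Radiator area needed to reject waste heat P at temperature T,
# from the Stefan-Boltzmann law: P = eps * sigma * A * T^4.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(power_w: float, temp_k: float, emissivity: float = 0.9) -> float:
    """One-sided radiator area (m^2) to reject power_w watts at temp_k kelvin."""
    return power_w / (emissivity * SIGMA * temp_k ** 4)

area = radiator_area_m2(1e6, 350.0)
print(f"{area:.0f} m^2 per MW at 350 K")  # ~1,300 m^2 with eps = 0.9
```

Because area scales linearly with power, a 100 MW orbital facility would need on the order of 130,000 m² of radiator surface at this temperature, which is the mass problem in a nutshell.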
Launch Economics Don’t Add Up
The business case hinges on launch costs falling to levels that may never materialize. Analysis of historical and projected launch pricing suggests that, with a sustained learning rate, prices may fall below $200/kg by the mid-2030s. At that price point, the cost of launching and operating a space-based data center could become roughly comparable to the reported energy costs of an equivalent terrestrial data center.
The reusable Falcon 9 today delivers a cost to orbit of roughly $3,600/kg; making space data centers viable will require prices closer to $200/kg, an 18-fold improvement. Even that assumes technical miracles. A 1 GW orbital data center might cost $42.4 billion, almost three times its ground-bound equivalent, thanks to the up-front costs of building the satellites and launching them to orbit.
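To see how demanding an 18-fold drop is, one can frame it with Wright’s law, the standard model behind “sustained learning rate” projections: each doubling of cumulative launch volume cuts the price by a fixed fraction. A sketch, assuming a 20% learning rate (the rate itself is an assumption, not a figure from the article):

```python
import math

def doublings_to_reach(price_now: float, price_target: float, learning_rate: float) -> float:
    """Wright's-law doublings of cumulative launch volume needed for price to fall
    from price_now to price_target, if each doubling cuts the price by learning_rate."""
    return math.log(price_target / price_now) / math.log(1.0 - learning_rate)

# Falcon 9 today vs. the ~$200/kg assumed by orbital data center proposals
d = doublings_to_reach(3600.0, 200.0, 0.20)  # 20% learning rate: an assumed value
print(f"{3600 / 200:.0f}x price drop needs ~{d:.0f} doublings of cumulative launches")
```

At a 20% learning rate, the 18x drop takes roughly 13 doublings, i.e., cumulative launch volume growing by a factor of several thousand, which is why the timeline rests on assumptions rather than extrapolation.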
SpaceX’s Starlink satellites get their power from onboard solar panels, but the cost of acquiring, launching, and maintaining those spacecraft works out to $14,700 per kW per year, compared to $570 to $3,000 per kW per year for ground-based data centers.
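Put side by side, those figures imply a wide cost gap for delivered power. A quick check of the ratio, using only the numbers quoted above:

```python
# Annualized cost per kW of delivered power, per the figures in the text.
space_kw_year = 14_700            # Starlink-derived estimate, $/kW-year
ground_low, ground_high = 570, 3_000  # terrestrial range, $/kW-year

low = space_kw_year / ground_high   # vs. the most expensive terrestrial case
high = space_kw_year / ground_low   # vs. the cheapest terrestrial case
print(f"Orbital power costs {low:.0f}x to {high:.0f}x more than terrestrial")  # 5x to 26x
```

Even against the most expensive terrestrial power in that range, orbital power remains roughly five times more costly per kilowatt-year.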
SpaceX will not want to charge much less than its best competitor—if Blue Origin’s New Glenn rocket retails at $70 million, SpaceX won’t take on Starship missions for external customers at much less than that, which would leave it above the numbers publicly assumed by space data center builders. Market dynamics trump Moore’s Law.
Environmental Costs, Not Savings
Proponents frame orbital data centers as carbon-neutral. The data tell a different story. Starcloud estimates that a solar-powered space data center could achieve 10 times lower carbon emissions than a land-based data center powered by natural gas generators. But researchers at Saarland University calculated that an orbital data center powered by solar energy could still produce an order of magnitude more emissions than a data center on Earth, once the emissions from rocket launches and the reentry of spacecraft components are taken into account.
Most of those extra emissions come from burning rocket stages and hardware on reentry, according to research published as “Dirty Bits in Low-Earth Orbit.” Chemical rockets put out huge plumes of polluting exhaust, and for a massive vehicle like SpaceX’s Starship, with 33 first-stage engines, that adds up, especially with hundreds of launches planned. Proponents counter that the environmental costs still net out as a plus, since space data centers take processing off the fossil-fuel-burning grid.
Orbital Debris and the Kessler Threat
The scale of proposed constellations raises existential risks for spaceflight itself. Competing in the AI market would require launching hundreds of thousands, if not millions, of satellites; this would utterly dwarf the roughly 15,000 satellites currently orbiting Earth, and deployments at that scale would dramatically increase the risk of Kessler syndrome.
The Kessler syndrome describes a situation in which the density of objects in low Earth orbit becomes so high that collisions between them cascade, exponentially increasing the amount of space debris over time. This proliferation of debris poses significant risks to satellites, space missions, and the International Space Station, potentially rendering certain orbital regions unusable. ESA’s debris modelling tool MASTER shows that in the low-Earth orbit range around 550 km altitude, there is now the same order of magnitude of threatening debris objects as there are active satellites.
Rapid hardware obsolescence and the difficulty of maintenance in orbit make these data centers effectively single-use. Satellites can’t be upgraded at scale: today, when a new generation of AI hardware is released, companies can start rolling it out across their data centers immediately, but in space, you would have to launch an entire new fleet of satellites.
Technical Barriers Remain Insurmountable
Beyond heat and launch costs, fundamental engineering challenges persist. Computing hardware must be protected from high radiation, through either shielding or error-correcting software. Rocket launch costs alone pose a significant challenge to building large orbital data centers, not to mention the need to replace onboard chips every five to six years.
For many AI workloads, communicating with data centers via satellites would be slower and less energy-efficient than using fiber-connected facilities on the ground; terrestrial fiber connections will always be faster and more efficient than sending every prompt to orbit. Latency matters for inference workloads, and physics imposes hard limits.
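The hard limit is the speed of light. A sketch of the floor on round-trip time, assuming a Starlink-like 550 km orbital altitude with the satellite directly overhead (real slant ranges and ground-station hops only add delay) and light in fiber traveling at roughly two-thirds of c:

```python
C_VACUUM = 299_792_458.0      # speed of light in vacuum, m/s
C_FIBER = C_VACUUM * 0.67     # assumed: light in fiber travels at ~2/3 c

def rtt_ms(path_m: float, speed_m_s: float) -> float:
    """Round-trip light time in milliseconds over a one-way path of path_m meters."""
    return 2 * path_m / speed_m_s * 1e3

# Best case for orbit: satellite at the zenith, 550 km straight up.
print(f"LEO hop:      {rtt_ms(550e3, C_VACUUM):.1f} ms minimum")   # ~3.7 ms
print(f"100 km fiber: {rtt_ms(100e3, C_FIBER):.2f} ms")            # ~1.0 ms
```

Every prompt routed through orbit pays that several-millisecond tax before any queuing or processing, while a fiber link to a regional facility can be well under it.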
The consensus among experts, according to Fortune, is that small pilot projects may emerge by the end of the decade, but nothing approaching the scale of today’s terrestrial data centers.
What to Watch
Google and Starcloud’s 2027 test missions will provide the first real-world data on orbital AI economics. Watch whether they can demonstrate:
Radiative cooling at multi-kilowatt scale without prohibitive mass penalties
Cost-per-inference competitive with Virginia hyperscale facilities
Reliable operations through radiation exposure and thermal cycling
Satellite constellation collision avoidance at proposed densities
The orbital data center race is less about solving AI’s energy crisis than deferring it—trading terrestrial grid constraints for orbital debris risks, and immediate carbon emissions for launch pollution amortized over five-year hardware lifespans. The physics remain unforgiving: space is hard, and cooling in a vacuum is harder. Until launch costs fall 18-fold and radiator mass stops scaling linearly with compute, AI’s energy problem stays firmly grounded.