Amazon Web Services CEO Matt Garman has delivered a pointed technical rebuttal to Elon Musk's vision of space-based data centers, highlighting fundamental engineering constraints that could keep computation firmly planted on Earth for the foreseeable future.
Speaking at the Cisco AI Summit in San Francisco, Garman identified mass and launch costs as the primary obstacles. "I don't know if you've seen a rack of servers lately: They're heavy," he noted, emphasizing that current launch economics make orbital data centers prohibitively expensive.
The technical reality supports his skepticism. A modern server rack weighs 1,000-2,000 kilograms fully loaded. Even with SpaceX's Falcon Heavy, currently the most powerful rocket in regular commercial service, launch costs run approximately $1,500 per kilogram to low-Earth orbit. That means $1.5-3 million just to launch a single server rack, before considering power systems, cooling infrastructure, radiation shielding, and the orbital structure that houses it all.
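For readers who want to check the arithmetic, a minimal sketch using the round figures above (rack masses and per-kilogram pricing both vary by configuration and contract):

```python
# Back-of-the-envelope launch cost per server rack, using the round figures above.
RACK_MASS_KG = (1_000, 2_000)   # fully loaded rack, light vs. heavy configuration
LAUNCH_COST_PER_KG = 1_500      # approximate Falcon Heavy price to low-Earth orbit, USD

for mass_kg in RACK_MASS_KG:
    cost_usd = mass_kg * LAUNCH_COST_PER_KG
    print(f"{mass_kg:,} kg rack -> ${cost_usd / 1e6:.1f} million just to reach orbit")
```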
Space exploration has a history of engineering constraints yielding to ambition, and occasionally the seemingly impossible gets built. But space-based data centers face a constellation of challenges that go far beyond launch costs.
Cooling presents perhaps the most fundamental problem. On Earth, data centers use ambient air or water cooling to dissipate the enormous heat generated by thousands of processors. In the vacuum of space, convection doesn't work. Heat can only be rejected through thermal radiation, requiring massive radiator arrays—which themselves add weight and complexity.
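To put rough numbers on the radiator problem, a sketch based on the Stefan-Boltzmann law; the heat load, panel temperature, and emissivity below are illustrative assumptions, not figures from Garman's remarks:

```python
# Rough radiator area needed to reject waste heat by thermal radiation alone.
# Illustrative assumptions: panels at 300 K, emissivity 0.9, radiating from both
# faces, and no absorbed sunlight or Earthshine (which would only make it worse).
SIGMA = 5.670e-8           # Stefan-Boltzmann constant, W / (m^2 * K^4)
HEAT_LOAD_W = 1_000_000    # 1 MW of waste heat
PANEL_TEMP_K = 300.0
EMISSIVITY = 0.9
FACES = 2                  # a flat panel radiates from both sides

flux_w_per_m2 = FACES * EMISSIVITY * SIGMA * PANEL_TEMP_K**4
area_m2 = HEAT_LOAD_W / flux_w_per_m2
print(f"~{area_m2:,.0f} m^2 of radiator panel per megawatt of waste heat")
```

Roughly a thousand square meters of radiator per megawatt, even under generous assumptions; a facility dissipating hundreds of megawatts would need radiator fields measured in tens of hectares, all of it launched, deployed, and plumbed in orbit.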
Power generation compounds the difficulty. Modern AI data centers consume hundreds of megawatts. The International Space Station generates roughly 120 kilowatts from its enormous solar arrays, and that power supports a crew of seven plus life support systems. Generating data center-scale power in orbit would require solar arrays spanning hundreds of thousands of square meters, or nuclear reactors that introduce their own regulatory and engineering challenges.
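The same style of estimate applies to solar power. Assuming the full solar constant above the atmosphere and an optimistic cell efficiency (assumed here for illustration, not drawn from the summit), the collecting area for a 100-megawatt facility comes out to:

```python
# Rough solar array area for a 100 MW orbital data center.
# Illustrative assumptions: 25% cell efficiency, 90% time-averaged illumination,
# and no losses from degradation, pointing, or power conversion.
SOLAR_CONSTANT_W_M2 = 1_361
CELL_EFFICIENCY = 0.25
SUNLIGHT_FRACTION = 0.9

target_power_w = 100e6
delivered_w_per_m2 = SOLAR_CONSTANT_W_M2 * CELL_EFFICIENCY * SUNLIGHT_FRACTION
area_m2 = target_power_w / delivered_w_per_m2
print(f"~{area_m2:,.0f} m^2 of solar array for a sustained 100 MW")
```

Call it roughly a third of a square kilometer of photovoltaics for 100 megawatts, far more collecting area than anything humanity has assembled in orbit, and the figure scales linearly for the multi-hundred-megawatt campuses now common on the ground.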
Latency creates another constraint. For many applications, the added speed-of-light delay to an orbital facility, measured in milliseconds, is small compared with the latency already present in terrestrial fiber networks. But for interactive workloads and AI training jobs split across ground and orbital sites, round-trip times of 10-50 milliseconds to low-Earth orbit add noticeable lag. Higher orbits, which offer more stable thermal environments and less atmospheric drag, push latency higher still.
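Those figures have a hard physical floor: the round trip at the speed of light to the satellite's altitude, before any routing or processing overhead. A quick sketch with typical altitudes (not any specific constellation's parameters):

```python
# Best-case speed-of-light round trip to a facility at a given orbital altitude,
# assuming it is directly overhead; real networks add routing and queuing on top.
C_KM_PER_S = 299_792

ORBITS = [
    ("low-Earth orbit (~550 km)", 550),
    ("medium-Earth orbit (~8,000 km)", 8_000),
    ("geostationary orbit (~35,786 km)", 35_786),
]

for label, altitude_km in ORBITS:
    rtt_ms = 2 * altitude_km / C_KM_PER_S * 1_000
    print(f"{label}: at least {rtt_ms:.0f} ms round trip")
```

The few-millisecond floor for low-Earth orbit is consistent with observed figures of 10-50 ms once ground stations and routing are added; at geostationary altitude the physics alone pushes the round trip past 200 ms.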
Garman acknowledged that advancing technology could eventually shift the economics. SpaceX has already launched thousands of Starlink satellites, and fully reusable rockets like Starship aim to reduce launch costs by another order of magnitude. "Improvements in rocket fuel efficiency" could make orbital deployment more practical, he conceded.
Yet even at $100 per kilogram—an optimistic target for Starship—a 100-megawatt data center equivalent would require launching thousands of tons of equipment. The ISS, humanity's largest space structure, weighs about 420 metric tons and took more than a decade to assemble. A space data center of useful scale would dwarf the ISS in both mass and complexity.
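A sketch of what even optimistic pricing implies, assuming, purely for illustration, that a 100 MW facility masses around 5,000 metric tons of servers, radiators, arrays, and structure, and that a fully reusable Starship delivers roughly 100 metric tons per flight:

```python
# Launch campaign for a hypothetical 100 MW orbital data center.
# Illustrative assumptions: 5,000 t of total hardware, $100 per kg to orbit,
# and ~100 t of payload per fully reusable Starship flight.
TOTAL_MASS_T = 5_000
COST_PER_KG_USD = 100
PAYLOAD_PER_FLIGHT_T = 100
ISS_MASS_T = 420

launch_cost_usd = TOTAL_MASS_T * 1_000 * COST_PER_KG_USD
flights = TOTAL_MASS_T / PAYLOAD_PER_FLIGHT_T
print(f"~{flights:.0f} flights and ${launch_cost_usd / 1e9:.1f} billion in launch costs alone")
print(f"Total mass: roughly {TOTAL_MASS_T / ISS_MASS_T:.0f}x the ISS")
```

Transport is only the first line item: none of that covers the hardware itself, on-orbit assembly at a scale an order of magnitude beyond the ISS, or the ongoing resupply such a facility would need.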
The comments came one day after Musk announced plans to merge SpaceX with his AI company xAI in a deal valued at approximately $1.25 trillion. Musk has framed space-based computation as enabling lunar bases, Martian settlements, and interplanetary expansion—contexts where orbital data centers might indeed make sense despite their costs.
For lunar or Mars operations, bringing computation closer to the point of use could justify the expense of space-based hardware. Communication delays between Earth and Mars range from 4 to 24 minutes one-way, making real-time control of robots or spacecraft impossible without local computing power. Space-based AI could enable autonomous systems essential for off-world settlement.
But for serving Earth-based users, the engineering case remains weak. Ground-based data centers benefit from cheap power, effective cooling, easy maintenance, straightforward upgrades, and proximity to fiber optic networks carrying exabytes of data daily. Orbital facilities offer none of these advantages.
Garman also noted that humanity has yet to establish permanent structures in space beyond the ISS, which requires constant maintenance and regular resupply missions. Data centers demand high reliability—five-nines uptime or better. Achieving that in space, where sending a repair technician costs millions and takes weeks, introduces operational challenges the industry has never confronted.
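For a sense of what five-nines means in practice, the standard availability arithmetic (not figures from the talk):

```python
# Allowed downtime per year at common availability targets.
MINUTES_PER_YEAR = 365.25 * 24 * 60

for label, availability in [("three nines", 0.999),
                            ("four nines", 0.9999),
                            ("five nines", 0.99999)]:
    downtime_min = MINUTES_PER_YEAR * (1 - availability)
    print(f"{label} ({availability}): ~{downtime_min:,.1f} minutes of downtime per year")
```

A five-nines budget allows barely five minutes of downtime a year, a hard target to hit when the nearest repair technician is a launch away.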
The debate reflects broader questions about where space infrastructure makes technical and economic sense. Communications satellites succeeded because they could serve customers impossible to reach any other way. GPS works because geometry requires satellites in specific orbits. Space telescopes access wavelengths blocked by Earth's atmosphere.
Data centers offer no similar inherent advantage from orbital deployment. The physics actively works against them. Unless applications emerge that specifically require space-based computation—and can justify costs orders of magnitude higher than ground facilities—the cloud will likely remain firmly terrestrial for decades to come.
Still, Musk has made a career of achieving engineering feats experts deemed impractical, and constraints that looked immovable have yielded to him before. But this time, the laws of thermodynamics and orbital mechanics may prove more stubborn than the skeptics he has overruled in the past.

