To infinity and beyond
Google is already zapping TPUs with radiation to get ready.
The tech industry is on a tear, building data centers for AI as quickly as they can buy up the land. The sky-high energy costs and logistical headaches of managing all those data centers have prompted interest in space-based infrastructure. Moguls like Jeff Bezos and Elon Musk have mused about putting GPUs in space, and now Google confirms it’s working on its own version of the technology. The company’s latest “moonshot” is known as Project Suncatcher, and if all goes as planned, Google hopes it will lead to scalable networks of orbiting TPUs.
The space around Earth has changed a lot in the last few years. A new generation of satellite constellations like Starlink has shown it’s feasible to relay Internet communication via orbital systems. Deploying high-performance AI accelerators in space along similar lines would be a boon to the industry’s never-ending build-out. Google notes that space may be “the best place to scale AI compute.”
Google’s vision for scalable orbiting data centers relies on solar-powered satellites with free-space optical links connecting the nodes into a distributed network. Naturally, there are numerous engineering challenges to solve before Project Suncatcher becomes a reality. As a reference point, Google cites the long road from its first moonshot, the self-driving cars of 15 years ago, to the Waymo vehicles that operate almost fully autonomously today.
Taking AI to space
Some of the benefits are obvious. Google’s vision for Suncatcher, as explained in a preprint study (PDF), would place the satellites in a dawn-dusk sun-synchronous low-Earth orbit. That ensures they would get almost constant sunlight exposure (hence the name). The cost of electricity on Earth is a problem for large data centers, and even moving them all to solar power wouldn’t get the job done. Google notes that solar panels can be up to eight times more productive in orbit than they are on the surface of Earth. Lots of uninterrupted sunlight at higher efficiency means more power for data processing.
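A quick back-of-the-envelope calculation shows where a figure like that can come from. The sketch below is illustrative only: the solar constant is a physical fact, but the ground-side capacity factor and the orbit's sunlit fraction are assumptions for this example, not numbers from Google's study.

```python
# Rough annual solar yield per square meter: dawn-dusk orbit vs. ground.
# Illustrative assumptions, not figures from Google's paper.

SOLAR_CONSTANT_W_M2 = 1361       # irradiance above the atmosphere
GROUND_PEAK_W_M2 = 1000          # typical peak irradiance at the surface
GROUND_CAPACITY_FACTOR = 0.18    # assumed: night, weather, and sun angle combined
ORBIT_SUNLIT_FRACTION = 0.99     # assumed: dawn-dusk SSO is almost never eclipsed
HOURS_PER_YEAR = 8760

orbit_kwh = SOLAR_CONSTANT_W_M2 * ORBIT_SUNLIT_FRACTION * HOURS_PER_YEAR / 1000
ground_kwh = GROUND_PEAK_W_M2 * GROUND_CAPACITY_FACTOR * HOURS_PER_YEAR / 1000

print(f"Orbit:  {orbit_kwh:,.0f} kWh per m^2 per year")
print(f"Ground: {ground_kwh:,.0f} kWh per m^2 per year")
print(f"Ratio:  {orbit_kwh / ground_kwh:.1f}x")  # ~7.5x, near the "up to eight times" figure
```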
A major sticking point is keeping the satellites connected at high speeds as they orbit. On Earth, the nodes in a data center communicate via blazing-fast optical interconnect chips. Maintaining high-speed communication among the orbiting servers will require wireless solutions that can operate at tens of terabits per second. In early bench testing, Google has demonstrated bidirectional speeds of up to 1.6 Tbps, and it believes this can be scaled up over time.
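For a sense of the gap, here is a trivial sketch of how many links at the demonstrated rate it would take to hit the target. The 1.6 Tbps per-link figure is from Google's test; the 10 Tbps target and the idea of simply running parallel channels are assumptions for illustration.

```python
import math

# Per-link rate is from Google's bench test; the target and the
# parallel-channel strategy are assumptions for illustration.
DEMONSTRATED_TBPS = 1.6
TARGET_TBPS = 10.0  # hypothetical low end of "tens of terabits per second"

links_needed = math.ceil(TARGET_TBPS / DEMONSTRATED_TBPS)
print(f"{links_needed} parallel 1.6 Tbps channels to reach {TARGET_TBPS:.0f} Tbps")
# -> 7 parallel channels
```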
Google’s proposed free-fall (“no thrust”) constellation of linked satellites; the arrow points toward Earth.
Physics presents another problem, however. Received power decreases with the square of distance, so Google notes the satellites would have to stay within a kilometer or less of one another. That would require a tighter formation than any currently operational constellation, but it should be workable. Google has developed analytical models suggesting that satellites positioned several hundred meters apart would require only “modest station-keeping maneuvers.”
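The inverse-square relationship is easy to see in code. This is a minimal sketch of relative received power under a bare inverse-square model; a real optical link budget would also account for beam divergence, pointing error, and receiver aperture, none of which are modeled here.

```python
# Relative received optical power under a bare inverse-square model.
# Real link budgets also fold in beam divergence, pointing accuracy,
# and receiver aperture; this only shows the distance scaling.

def relative_received_power(distance_m: float, reference_m: float = 100.0) -> float:
    """Received power relative to a link at `reference_m` separation."""
    return (reference_m / distance_m) ** 2

for d_m in (100, 300, 1000, 5000):
    print(f"{d_m:>5} m: {relative_received_power(d_m):.4f}x the power at 100 m")
# At 1 km the receiver sees just 1% of the power it gets at 100 m,
# which is why the formation has to fly so tight.
```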
Hardware designed for space is expensive and often less capable than terrestrial systems because it has to be hardened against extreme temperatures and radiation. Google’s approach with Project Suncatcher is to reuse the same components it deploys on Earth, even though they weren’t designed to survive inside a satellite. However, missions like the Snapdragon-powered Mars Ingenuity helicopter have shown that off-the-shelf hardware may survive longer in space than we thought.
Google says Suncatcher only works if TPUs can run for at least five years, which works out to a cumulative radiation dose of 750 rad over that period. The company is testing this by blasting its latest v6e Cloud TPU (Trillium) with a 67 MeV proton beam. Google says that while the memory was the most vulnerable to damage, the experiments showed the TPUs could handle about three times that dose (almost 2 krad) before any data corruption was detected.
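The arithmetic behind those numbers is straightforward. The mission dose and observed tolerance below come from the figures above; the margin calculation is just their ratio.

```python
# Dose budget implied by the article's figures.
MISSION_YEARS = 5
MISSION_DOSE_RAD = 750          # cumulative dose over the five-year mission
OBSERVED_TOLERANCE_RAD = 2000   # TPUs held up to ~2 krad before corruption

annual_dose_rad = MISSION_DOSE_RAD / MISSION_YEARS
margin = OBSERVED_TOLERANCE_RAD / MISSION_DOSE_RAD

print(f"Implied dose rate: {annual_dose_rad:.0f} rad per year")
print(f"Safety margin:     {margin:.1f}x the five-year requirement")  # ~2.7x
```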
Google hopes to launch a pair of prototype satellites with TPUs aboard by early 2027. It expects the launch cost of these first AI orbiters to be quite high, but Google is planning for the mid-2030s, when launch costs are projected to drop to as little as $200 per kilogram. At that price, space-based data centers could become roughly as economical as their terrestrial counterparts.
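To see why that number matters, here is a crude amortization sketch. Only the $200/kg figure comes from the article; the mass per kilowatt, the five-year lifetime, and the electricity rate are hypothetical placeholders chosen purely for illustration.

```python
# Crude launch-cost amortization. Only the $200/kg target comes from the
# article; every other input here is a hypothetical placeholder.

LAUNCH_COST_PER_KG = 200         # mid-2030s projection cited in the article
KG_PER_KW = 10                   # assumed satellite mass per kW of compute
LIFETIME_YEARS = 5               # assumed life, matching the TPU dose goal
ELECTRICITY_USD_PER_KWH = 0.08   # assumed industrial rate on Earth

launch_per_kw_year = LAUNCH_COST_PER_KG * KG_PER_KW / LIFETIME_YEARS
grid_per_kw_year = 8760 * ELECTRICITY_USD_PER_KWH  # hours/year * $/kWh

print(f"Launch, amortized: ~${launch_per_kw_year:.0f} per kW-year")
print(f"Grid power:        ~${grid_per_kw_year:.0f} per kW-year")
# Same order of magnitude, which is the point of the $200/kg threshold.
```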
The fact is, terrestrial data centers are dirty, noisy, and ravenous for power and water. This has led many communities to oppose plans to build them near the places where people live and work. Putting them in space could solve everyone’s problems (unless you’re an astronomer).
Ryan Whitwam is a senior technology reporter at Ars Technica, covering the ways Google, AI, and mobile technology continue to change the world. Over his 20-year career, he’s written for Android Police, ExtremeTech, Wirecutter, NY Times, and more. He has reviewed more phones than most people will ever own. You can follow him on Bluesky, where you will see photos of his dozens of mechanical keyboards.