As artificial intelligence (AI) models continue to grow larger and more power-hungry, researchers are beginning to ask not just whether they can be trained, but where. That's the context behind Google Research's recent proposal to explore space-based AI infrastructure, an idea that sits somewhere between serious science and orbital overreach.
The idea, dubbed "Project Suncatcher" and outlined in a study uploaded Nov. 22 to the preprint arXiv database, explores whether future AI workloads could be run on constellations of satellites equipped with specialized accelerators and powered primarily by solar energy.
The push to look beyond Earth for AI infrastructure isn't coming out of nowhere. Data centers already consume a non-trivial slice of the world's power supply: recent estimates put global data-center electricity use at roughly 415 terawatt-hours in 2024, or about 1.5% of total global electricity consumption, with projections suggesting this could more than double by 2030 as AI workloads surge.
Utilities in the U.S. are already planning for data centers, driven largely by AI workloads, to account for between 6.7% and 12% of total electricity demand in some areas by 2028, prompting some executives to warn that there simply "isn't enough power on the grid" to support unchecked AI growth without significant new generation capacity.
In that context, proposals like space-based data centers start to read less like sci-fi indulgence and more like a symptom of an industry confronting the physical limits of Earth-bound energy and cooling. On paper, space-based data centers sound like an elegant solution. In practice, some experts are unconvinced.
Reaching for the stars
Joe Morgan, COO of data center infrastructure firm Patmos, is blunt about the near-term prospects. "What won't happen in 2026 is the whole 'data centers in space' thing," he told Live Science. "One of the tech billionaires might actually get close to doing it, but aside from bragging rights, why?"
Morgan points out that the industry has repeatedly flirted with extreme cooling concepts, from mineral-oil immersion to subsea facilities, only to abandon them once operational realities bite. "There's still hype about building data centers under the ocean, but any thermal benefits are far outweighed by the difficulty of replacing components," he said, noting that hardware churn is fundamental to modern computing.
That churn is central to the skepticism around orbital AI. GPUs and specialized accelerators depreciate quickly as new architectures deliver step-change improvements every few years. On Earth, racks can be swapped, boards replaced and systems upgraded repeatedly. In orbit, every repair requires launches, docking or robotic servicing, none of which scale easily or cheaply.
"Who wants to take a spaceship to update the orbital infrastructure every year or two?" Morgan asks. "What if a major component breaks? Actually, forget that, what about the latency?"
Latency isn't a footnote. Most AI workloads depend on tightly coupled systems with extremely fast interconnects, both within data centers and between them. Google's proposal leans heavily on laser-based inter-satellite links to mimic those connections, but the physics remains unforgiving. Even at low Earth orbit, round-trip latency to ground stations is unavoidable.
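To put rough numbers on that, consider the bare speed-of-light limit. The short sketch below is a back-of-the-envelope calculation, not anything from Google's study; the orbital altitudes and elevation angles are illustrative assumptions. Even a satellite passing directly overhead adds several milliseconds of round-trip delay before any processing or routing happens, while modern in-rack interconnects operate in microseconds.

```python
# Back-of-the-envelope estimate of the physical floor on ground-to-satellite
# latency at low Earth orbit. Altitudes and elevation angles are illustrative
# assumptions, not figures from Google's paper.
import math

C = 299_792.458  # speed of light in vacuum, km/s

def round_trip_ms(altitude_km: float, elevation_deg: float = 90.0) -> float:
    """Minimum round-trip time to a satellite at the given altitude.

    elevation_deg = 90 means the satellite is directly overhead; lower
    elevations stretch the slant range and add latency.
    """
    earth_radius_km = 6_371.0
    elev = math.radians(elevation_deg)
    r_sat = earth_radius_km + altitude_km
    # Slant range from ground station to satellite (law of cosines on the
    # Earth-center / station / satellite triangle).
    slant = (-earth_radius_km * math.sin(elev)
             + math.sqrt((earth_radius_km * math.sin(elev)) ** 2
                         + r_sat ** 2 - earth_radius_km ** 2))
    return 2 * slant / C * 1000  # milliseconds

if __name__ == "__main__":
    for alt in (550, 650):          # assumed LEO altitudes, km
        for elev in (90, 30):       # overhead vs. low on the horizon
            print(f"{alt} km, {elev} deg elevation: "
                  f"{round_trip_ms(alt, elev):.1f} ms minimum round trip")
```

And that is only the physical floor: real systems add switching, scheduling and protocol overhead on top of it.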
"Putting the servers in orbit is a stupid idea, unless your customers are also in orbit," Morgan said. But not everyone agrees it should be dismissed so quickly. Paul Kostek, a senior member of IEEE and a systems engineer at Air Direct Solutions, said the interest reflects real physical pressures on terrestrial infrastructure.
"The interest in placing data centers in space has grown as the cost of building centers on Earth keeps rising," Kostek said. "There are a number of advantages to space-based or Moon-based centers. First, access to 24 hours a day of solar power… and second, the ability to cool the centers by radiating excess heat into space versus using water."
From a purely thermodynamic standpoint, those arguments are sound. Heat rejection is one of the hardest limits on computation, and Earth-based data centers are increasingly constrained by water availability, grid capacity and local environmental opposition.
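There is a catch that the quote glosses over, though: in vacuum there is no air or water to carry heat away, so every watt of waste heat has to leave by thermal radiation alone. The sketch below applies the Stefan-Boltzmann law with assumed values for radiator temperature, emissivity and heat load, not figures from Google's study, to give a feel for how much radiator area a megawatt-class orbital data center might need.

```python
# Rough sizing of a purely radiative heat-rejection system in space using the
# Stefan-Boltzmann law. Power level, radiator temperature and emissivity are
# illustrative assumptions, not values from Google's study.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(heat_w: float, temp_k: float, emissivity: float = 0.9,
                     absorbed_w_per_m2: float = 0.0) -> float:
    """Radiator area needed to reject `heat_w` watts at temperature `temp_k`.

    `absorbed_w_per_m2` approximates environmental loads (sunlight, Earth
    albedo and infrared) soaked up by the radiator; 0 models an ideal
    cold-sky view.
    """
    net_flux = emissivity * SIGMA * temp_k ** 4 - absorbed_w_per_m2
    if net_flux <= 0:
        raise ValueError("Radiator cannot reject heat at this temperature")
    return heat_w / net_flux

if __name__ == "__main__":
    # Example: reject 1 MW of waste heat with radiators running at 330 K
    # (about 57 C), once with an ideal view of deep space and once with
    # 200 W/m^2 of absorbed environmental flux.
    for absorbed in (0.0, 200.0):
        area = radiator_area_m2(1_000_000, 330, absorbed_w_per_m2=absorbed)
        print(f"absorbed {absorbed:>5.0f} W/m^2 -> ~{area:,.0f} m^2 of radiator")
```

Under those assumptions, a single megawatt of compute implies on the order of a couple of thousand square meters of radiator panels, which is hardware that has to be launched, deployed and kept pointed away from the sun.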
The backlash against terrestrial AI infrastructure isn't limited to energy and water issues; health fears are increasingly part of the narrative. In Memphis, residents near xAI's massive Colossus data center have voiced concern about air quality and long-term respiratory impacts, with community members reporting worsened symptoms and fears of pollution-linked illnesses since the facility began operating. In other states, opponents of proposed hyperscale data center projects have framed their resistance around potential health and environmental harms, arguing that large facilities could degrade local air and water quality and exacerbate existing public health burdens.
Putting data centers into orbit would remove some constraints but replace them with others.
Staying grounded
"The technology questions that need to be answered include: Can the current processors used in data centers on Earth survive in space?" Kostek said. "Will the processors be able to survive solar storms or exposure to higher radiation on the Moon?"
Google researchers have already begun probing some of these questions through early work on Project Suncatcher. The team describes radiation testing of its Tensor Processing Units (TPUs) and modeling of how tightly clustered satellite formations could support the high-bandwidth inter-satellite links needed for distributed computing. Even so, Kostek stresses that the work remains exploratory.
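The formation-flying point is easier to appreciate with a rough optical-link calculation. The sketch below uses assumed values for laser wavelength, beam size and telescope aperture rather than anything from the Suncatcher paper; it simply shows how quickly a laser beam spreads with distance, and why packing satellites within a kilometer or so of each other preserves far more received power, and hence bandwidth, than spacing them hundreds of kilometers apart.

```python
# Toy link-geometry sketch: how much of a laser's transmitted power lands on
# the receiving telescope as satellite spacing grows. Wavelength, beam waist
# and aperture size are illustrative assumptions, not values from the
# Suncatcher study.
import math

def received_fraction(link_km: float, wavelength_nm: float = 1550.0,
                      waist_radius_m: float = 0.04,
                      rx_aperture_radius_m: float = 0.04) -> float:
    """Fraction of a Gaussian laser beam's power captured by a circular aperture."""
    lam = wavelength_nm * 1e-9
    z = link_km * 1e3
    z_rayleigh = math.pi * waist_radius_m ** 2 / lam
    # Beam radius grows with distance once past the Rayleigh range.
    beam_radius = waist_radius_m * math.sqrt(1 + (z / z_rayleigh) ** 2)
    return 1 - math.exp(-2 * (rx_aperture_radius_m / beam_radius) ** 2)

if __name__ == "__main__":
    for km in (0.3, 1, 10, 100, 1000):
        frac = received_fraction(km)
        print(f"{km:>6g} km link: ~{frac * 100:.3f}% of transmitted power received")
```

With these assumed optics, a few hundred meters of separation keeps most of the beam on the receiver, while a 100-kilometer gap delivers well under 1% of the power, which is why tight formation flying is central to the concept.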
"Preliminary testing is being done to determine the viability of space-based data centers," he said. "While significant technical hurdles remain and implementation is still several years away, this approach could eventually offer an effective way to achieve expansion."
That word, expansion, may be the real clue. For some researchers, the most compelling rationale for off-world computing has little to do with serving Earth-based users at all. Christophe Bosquillon, co-chair of the Moon Village Association's working group for Disruptive Technology & Lunar Governance, argues that space-based data centers make more sense as infrastructure for space itself.
"With humanity on track to soon establish a permanent lunar presence, an infrastructure backbone for a future data-driven lunar industry and the cis-lunar economy is warranted," he told Live Science.
From this angle, space-based data centers aren't substitutes for Earth's infrastructure so much as tools for enabling space activity, handling everything from lunar sensor data to autonomous systems and navigation.
"Affordable energy is a key challenge for all activities and will include a nuclear component next to solar power and arrays of fuel cells and batteries," Bosquillon said, adding that the challenges extend well beyond engineering to governance, regulation and international coordination.
Crucially, space-based computing could offload non-latency-sensitive workloads from Earth altogether. "Solving the energy problem in space and taking that burden off the Earth to process Earth-related non-latency-sensitive data… has merit," Bosquillon said, even extending to the idea of space and the Moon as a secure vault for "civilisational" data.
Seen this way, Google's proposal looks less like a solution to today's data center shortages and more like a probe into the long-term physics of computation. As AI approaches planetary-scale energy consumption, the question may not be whether Earth has enough capacity, but whether researchers can afford to ignore environments where energy is abundant but everything else is hard.
For now, space-based AI remains strictly experimental. Whether it ever escapes Earth's gravity may depend less on solar panels and lasers than on how desperate the energy race becomes.
