Google is exploring a bold new frontier for artificial intelligence (AI) by considering the placement of data centres in space. The concept, known as Project Suncatcher, aims to power AI operations using satellites equipped with solar panels, harnessing continuous solar energy to reduce carbon emissions and reliance on terrestrial power sources.
Aiming for clean, limitless energy in orbit
Project Suncatcher represents a return to the kind of ambitious, experimental initiatives that once defined Google’s research division. The company’s plan involves launching its Tensor Processing Units (TPUs) — specialised chips designed for AI workloads — aboard satellites powered entirely by solar energy.
“In the future, space may be the best place to scale AI compute,” said Travis Beals, a senior director at Google. “In the right orbit, a solar panel can be up to eight times more productive than on Earth, and produce power nearly continuously, reducing the need for batteries.”
By placing data centres in orbit, Google hopes to harness the sun’s uninterrupted energy while eliminating the environmental costs associated with ground-based energy consumption. This approach could help address one of AI’s most pressing sustainability issues — the vast amounts of electricity required to train and run large-scale machine learning models, which often rely on fossil-fuel-powered grids.
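To give a rough sense of where a figure like “up to eight times more productive” could come from, the back-of-envelope sketch below compares a panel in a near-continuously sunlit orbit with a well-sited terrestrial panel. The sunlit fraction and capacity factor used here are illustrative assumptions, not figures published by Google.

```python
# Rough arithmetic (illustrative assumptions, not Google's figures) behind the
# "up to eight times more productive" claim: an orbital panel receives stronger
# sunlight (no atmospheric losses) and is lit almost all the time, whereas a
# terrestrial panel is limited by night, weather and the atmosphere.
SOLAR_CONSTANT_W_M2 = 1361.0      # irradiance above the atmosphere
TERRESTRIAL_PEAK_W_M2 = 1000.0    # standard "1 sun" rating at ground level
ORBIT_SUNLIT_FRACTION = 0.99      # near-continuous illumination in the right orbit (assumed)
GROUND_CAPACITY_FACTOR = 0.20     # typical annual figure for a good terrestrial site (assumed)

orbital_yield = SOLAR_CONSTANT_W_M2 * ORBIT_SUNLIT_FRACTION
ground_yield = TERRESTRIAL_PEAK_W_M2 * GROUND_CAPACITY_FACTOR
print(f"Orbital vs. ground energy yield per m^2: ~{orbital_yield / ground_yield:.1f}x")
# Prints ~6.7x with these assumptions; sunnier assumptions push the ratio toward 8x.
```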
Technical challenges in space-based computing
While the vision is innovative, the reality of running AI hardware in space presents formidable engineering challenges. Electronic components, particularly chips like Google’s TPUs, face significant risks from radiation exposure in orbit, which can degrade or destroy delicate circuits.
However, Google stated that its existing chips have already undergone radiation testing and can operate for up to five years in space without permanent failure.
Another major obstacle is data transmission. Large AI models depend on high-speed links between computing units, and sustaining data transfer speeds of “tens of terabits per second” with low latency in orbit is highly challenging given power limitations and the distances involved.
To overcome this, Google is exploring the possibility of positioning satellites in close formations — within just a few kilometres of each other — to enhance communication speeds. This setup could also reduce the energy needed for station-keeping, or maintaining satellites in stable orbits.
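The short sketch below, using assumed distances rather than Google’s design parameters, illustrates why kilometre-scale spacing matters: light-speed propagation delay stays in the microsecond range, while received optical power falls off roughly with the square of the separation once the beam spreads beyond the receiver.

```python
# Back-of-envelope sketch (assumed numbers, not Google's published design) of why
# close formation flying helps optical inter-satellite links.
C = 299_792_458.0  # speed of light, m/s

def one_way_delay_us(distance_m: float) -> float:
    """Light-speed propagation delay in microseconds."""
    return distance_m / C * 1e6

def relative_received_power(distance_m: float, reference_m: float = 1_000.0) -> float:
    """Received power relative to a 1 km baseline, assuming far-field beam
    divergence so power density scales as 1/d^2."""
    return (reference_m / distance_m) ** 2

for d_km in (1, 10, 100):
    d = d_km * 1_000
    print(f"{d_km:>4} km: delay ~ {one_way_delay_us(d):6.1f} us, "
          f"relative received power ~ {relative_received_power(d):.4f}")
```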
Economic feasibility and upcoming trials
Beyond technical hurdles, cost remains the biggest question. Launching and maintaining AI hardware in orbit is expensive. Still, Google’s internal analysis suggests that by the mid-2030s, the energy efficiency of orbital AI operations could be “roughly comparable” to traditional data centres on Earth.
Project Suncatcher is still in the research stage, but Google plans to move from theory to testing within the decade. The company has partnered with Planet, a private satellite imaging firm, to conduct a “learning mission” by 2027. This trial will involve launching two prototype satellites equipped with TPUs into orbit.
“This experiment will test how our models and TPU hardware operate in space and validate the use of optical inter-satellite links for distributed ML [machine learning] tasks,” Google wrote in its research announcement.
If successful, the experiment could lay the groundwork for a new era of sustainable, off-world computing — one that could reshape how AI infrastructure is powered and scaled in the future.