Friday, 12 December 2025

Google explores orbital data centres for sustainable AI computing

Google explores powering AI from space with Project Suncatcher, aiming to use solar-powered satellites for sustainable data processing.

Google is exploring a bold new frontier for artificial intelligence (AI) by considering the placement of data centres in space. The concept, known as Project Suncatcher, aims to power AI operations using satellites equipped with solar panels, harnessing continuous solar energy to reduce carbon emissions and reliance on terrestrial power sources.

Aiming for clean, limitless energy in orbit

Project Suncatcher represents a return to the kind of ambitious, experimental initiatives that once defined Google’s research division. The company’s plan involves launching its Tensor Processing Units (TPUs) — specialised chips designed for AI workloads — aboard satellites powered entirely by solar energy.

“In the future, space may be the best place to scale AI compute,” said Travis Beals, a senior director at Google. “In the right orbit, a solar panel can be up to eight times more productive than on Earth, and produce power nearly continuously, reducing the need for batteries.”

By placing data centres in orbit, Google hopes to harness the sun’s uninterrupted energy while eliminating the environmental costs associated with ground-based energy consumption. This approach could help address one of AI’s most pressing sustainability issues — the vast amounts of electricity required to train and run large-scale machine learning models, which often rely on fossil-fuel-powered grids.
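To put that productivity claim in rough perspective, the back-of-envelope sketch below compares annual energy yield per kilowatt of panel capacity on the ground versus in orbit. The “up to eight times” multiplier comes from the article; the ground capacity factor and per-kilowatt framing are illustrative assumptions, not figures from Google.

```python
# Back-of-envelope comparison of solar yield in orbit vs on the ground,
# using the "up to eight times more productive" figure quoted in the article.
# All other numbers here are illustrative assumptions, not Google's figures.

GROUND_CAPACITY_FACTOR = 0.20   # assumed: typical utility-scale solar (~20%)
ORBIT_PRODUCTIVITY = 8          # from the article: up to 8x ground productivity
PANEL_RATED_KW = 1.0            # compare per kilowatt of rated panel capacity
HOURS_PER_YEAR = 365 * 24

ground_kwh = PANEL_RATED_KW * GROUND_CAPACITY_FACTOR * HOURS_PER_YEAR
orbit_kwh = ground_kwh * ORBIT_PRODUCTIVITY

print(f"Ground yield per rated kW: {ground_kwh:,.0f} kWh/year")
print(f"Orbital yield per rated kW: {orbit_kwh:,.0f} kWh/year")
```

Under these assumptions, the same panel capacity would deliver roughly 1,750 kWh per year on the ground versus around 14,000 kWh per year in orbit, which is why continuous sunlight is central to the project's sustainability pitch.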

Technical challenges in space-based computing

While the vision is innovative, the reality of running AI hardware in space presents formidable engineering challenges. Electronic components, particularly chips like Google’s TPUs, face significant risks from radiation exposure in orbit, which can degrade or destroy delicate circuits.

However, Google stated that its existing chips have already undergone radiation testing and can operate for up to five years in space without permanent failure.

Another major obstacle lies in the transmission of data. AI models require high-speed connections for rapid communication between computing units. In space, maintaining data transfer speeds of “tens of terabits per second” with low latency is highly challenging due to power limitations and the distances involved.

To overcome this, Google is exploring the possibility of positioning satellites in close formations — within just a few kilometres of each other — to enhance communication speeds. This setup could also reduce the energy needed for station-keeping, or maintaining satellites in stable orbits.

Economic feasibility and upcoming trials

Beyond technical hurdles, cost remains the biggest question. Launching and maintaining AI hardware in orbit is expensive. Still, Google’s internal analysis suggests that by the mid-2030s, the energy efficiency of orbital AI operations could be “roughly comparable” to traditional data centres on Earth.

Project Suncatcher is still in the research stage, but Google plans to move from theory to testing within the decade. The company has partnered with Planet, a private satellite imaging firm, to conduct a “learning mission” by 2027. This trial will involve launching two prototype satellites equipped with TPUs into orbit.

“This experiment will test how our models and TPU hardware operate in space and validate the use of optical inter-satellite links for distributed ML [machine learning] tasks,” Google wrote in its research announcement.

If successful, the experiment could lay the groundwork for a new era of sustainable, off-world computing — one that could reshape how AI infrastructure is powered and scaled in the future.

