A holiday read on the role of space in the AI revolution
Hello everyone, and welcome back to the Ecosmic Compiler.
I’m Matteo, and I take care of Business Development at Ecosmic. In the past few weeks, you might have heard more and more people talking about putting data centres in space, and maybe you’ve asked yourself why. To begin answering that question, we’ll start with some breaking news: this AI thing you might have heard of is a pretty big deal.
How big, you ask? Well, let’s look at the numbers.
Data centre buildout is the most visible physical manifestation of the AI boom. Data centres power both training (long and expensive runs to develop new models), as well as inference (serving answers to users).
That’s why major AI stakeholders are rolling out infrastructure buildout programmes on a scale rarely seen in the tech industry. A good example is Project Stargate, a $500B project backed by OpenAI and SoftBank, among others.
Stargate’s first campus in Abilene, Texas, has around 372,000 m² of data centre space. That would be the equivalent of 70 American football fields or, for those like us who attend Space Tech Expo Europe more often than NFL games, around 10 Bremen Congress Centres. Pretty big, right?
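If you want to sanity-check the comparison yourself, one line of Python does it. The field area below is the standard ~5,350 m² including end zones; the Bremen figure is simply whatever the 10× ratio implies, so we only check the football fields:

```python
# Quick sanity check on the size comparison above.
campus_m2 = 372_000
football_field_m2 = 5_350  # 120 yd x 53.3 yd, end zones included

print(round(campus_m2 / football_field_m2))  # ~70 fields
```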
An energy problem
Footprint is easy enough to understand. But to see the real scale of data centre buildout, we should be looking at power consumption. Project Stargate is pitched at 10 gigawatts of data centre capacity when fully deployed. The Lawrence Berkeley National Laboratory estimated [1] that yearly AI-specific data centre energy consumption could reach 326 terawatt-hours by 2028. For comparison, Thailand’s entire yearly energy consumption sits at around 200 terawatt-hours [2].
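To put those numbers side by side, here’s a quick back-of-envelope sketch. All inputs are just the round figures quoted above, and real-world utilisation would sit below 100%:

```python
# Back-of-envelope check on the figures above.
HOURS_PER_YEAR = 8_760

stargate_gw = 10        # Stargate's pitched capacity, fully deployed
lbnl_2028_twh = 326     # LBNL upper estimate for AI-specific consumption
thailand_twh = 200      # Thailand's yearly consumption, for scale

# A 10 GW campus running flat out for a whole year:
stargate_twh = stargate_gw * HOURS_PER_YEAR / 1_000  # GWh -> TWh
print(f"Stargate at full tilt: ~{stargate_twh:.0f} TWh/year")  # ~88 TWh
print(f"LBNL 2028 estimate: {lbnl_2028_twh} TWh, "
      f"i.e. ~{lbnl_2028_twh / thailand_twh:.1f}x Thailand")
```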
Energy consumption is a good indicator of the scale of today’s data centre buildout. But together with permitting and regulatory issues, it’s also the main constraint on the kind of scaling major players envisage. Sam Altman, OpenAI’s CEO, was pretty explicit about the issue of energy consumption for AI data centres:
“We still don’t appreciate the energy needs of [AI] … There’s no way to get there without a breakthrough. We need fusion or we need, like, radically cheaper solar plus storage, or something at massive scale” [3]
The AI scaling problem is becoming, fundamentally, an energy problem. Energy infrastructure is notoriously complex to plan, approve, and deploy, and the scale of grid capacity needed in such a short timeframe is, to put it bluntly, unprecedented. Solving AI’s energy problem isn’t trivial. Altman mentioned fusion reactors. Quite conveniently, we happen to be orbiting one right now.
When things get hard, ship them to space

That is where space comes in: in the right orbit, solar arrays can operate at a much higher availability factor than on Earth, where they are limited by the day-night cycle, weather, and other atmospheric factors.
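A rough sketch shows why that matters in terms of energy delivered per installed kilowatt. Both capacity factors below are illustrative assumptions, with the orbital figure loosely inspired by a dawn-dusk sun-synchronous orbit that sits in near-constant sunlight:

```python
# Why orbit helps: annual energy delivered per installed kW of solar array.
# Capacity factors are illustrative assumptions, not measured values.
HOURS_PER_YEAR = 8_760

capacity_factors = {
    "ground solar, good site": 0.20,          # night, weather, seasons
    "dawn-dusk sun-synchronous orbit": 0.99,  # near-continuous sunlight
}

for site, cf in capacity_factors.items():
    print(f"{site}: ~{cf * HOURS_PER_YEAR:,.0f} kWh per kW per year")
# In this sketch, the orbital array delivers roughly 5x the annual energy.
```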
Additionally, space could offer easier scalability than ground infrastructure. Expanding a data centre on Earth requires going through complex approval and permitting processes, which add time and friction on top of the energy supply challenges. Expanding a space-based data centre should instead be as “simple” as launching a satellite: hard, expensive, but doable.
The space economy also creates its own demand: more and more data is being generated in orbit, and transmitting it to the ground is often limited by downlink capacity.
This has led companies and governmental entities to put forward plans for space-based data centres: Starcloud is proposing [4] massive modular data centres with 4 km × 4 km solar arrays, Google’s Project Suncatcher envisages [5] a constellation of 81 satellites flying only hundreds of metres apart to enable high-bandwidth inter-satellite links, and the European Space Policy Institute advocates [6] for a European orbital data centre effort as a question of technological sovereignty.
In addition to those, Blue Origin has reportedly been working on space-based data centres for more than a year, and SpaceX is rumoured to be doing the same [7]. Both companies also provide launch services, which surely comes in handy given that launch cost is one of the main factors, if not the main one, limiting the competitiveness of orbital computing.
The big bet

Ultimately, all of this will come down to a question of economics. And as any space enthusiast knows, behind most economics questions there are usually a few technology challenges waiting to be solved. Can you bring launch costs down to $200/kg to LEO? $30/kg? Can you keep data centres running in orbit, where repairs are hard? How do you handle radiation? How do you dump all that waste heat into space? And, finally, how does all that compare to the cost, speed, and energy needed to deploy data centres on Earth?
That is a lot of question marks for one paragraph. If you want to dig deeper into the economics of data centres in space, Varda’s Andrew McCalip has put together a great interactive model that lets you play with assumptions and see how orbital data centres can make economic sense.
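To give a feel for why the $/kg question dominates, here is a deliberately minimal sketch. The mass-per-kilowatt figure is a made-up placeholder rather than a real design number; the point is how hard the answer leans on launch cost:

```python
# A deliberately minimal take on the launch-cost question.

def launch_cost_per_kw(mass_kg_per_kw: float, usd_per_kg: float) -> float:
    """Launch cost to place one kW of data-centre capacity in LEO."""
    return mass_kg_per_kw * usd_per_kg

MASS_KG_PER_KW = 10.0  # hypothetical: spacecraft mass per kW of IT load

for usd_per_kg in (1_500, 200, 30):  # roughly today / optimistic / very optimistic
    cost = launch_cost_per_kw(MASS_KG_PER_KW, usd_per_kg)
    print(f"${usd_per_kg}/kg -> ~${cost:,.0f} per kW, launch alone")
```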
Big players are betting that the extra complexity will be worth it in the end and are starting to put money behind the idea. But even if you make the economics work, there’s another issue waiting, one that plagues the space industry more generally: safety and security.
Here be dragons (and space debris)
Today more than ever, space infrastructure is critical infrastructure. Much of the data keeping the world economy running and providing militaries with a strategic edge comes from space. As the focus across the industry shifts from raw data to insight, processing and compute are also going orbital, further pushing demand for space-based data centres.
Before, disruptions in orbit could cost a decision-maker precious data points, which is bad enough in itself. With space-based data centres, the risk is losing the ability to process in the first place. And with AI having become a flashpoint of geopolitical competition, the ability to train models and rely on them for inference in critical tasks is a vital strategic asset. Moving the infrastructure that makes this possible into orbit without a plan to keep it safe is asking for trouble.
Space weather and collision risks are very real threats. ESA estimates around 1.2 million pieces of debris larger than 1 cm in orbit [8]. Considering the current solar maximum and the ever-increasing launch rate, the menace they pose is only growing. Then you have hostile actors developing increasingly sophisticated counterspace capabilities and targeting commercial spacecraft as dual-use assets.
Beyond the disastrous, then, there’s the mundane (but no less critical): space hazards create a whole new class of availability risks. Collision alerts and space weather, even when not fatal, could force data centres into downtime. This is a challenge that proponents of space-based data centres are aware of: Starcloud cites [9] “on-orbit safety and orbital debris mitigation concerns” as a key hurdle in its white paper.
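How quickly does that downtime add up? A toy model makes the point. Every figure in it is a made-up assumption, included purely to show the shape of the problem:

```python
# Toy availability model for the "mundane" risk above.
HOURS_PER_YEAR = 8_760

disruptions = {
    "collision-avoidance manoeuvres": (12, 2.0),  # (events/year, hours each)
    "space-weather safe modes": (4, 12.0),
}

downtime_h = sum(n * hours for n, hours in disruptions.values())
availability = 1 - downtime_h / HOURS_PER_YEAR
print(f"{downtime_h:.0f} h/year down -> ~{availability:.2%} availability")
# ~72 h/year lands near 99.2%: well short of the "three nines" (99.9%)
# that terrestrial data centres routinely promise.
```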
Space today is more vibrant and alive than ever, but it’s also never been more congested and contested, still plagued by far too many unknown unknowns. That is where Space Domain Awareness comes in: clarity on the orbital environment is a fundamental enabler of space operations. For critical infrastructure, that enabler is non-negotiable.
Collision risk estimation and manoeuvre planning save time, money, and sometimes entire spacecraft from disruption (our white paper shows just how much of a difference good collision risk estimation can make). Advanced orbit determination enables complex constellation designs, and automated threat detection and response safeguards critical assets from hostile action. That is at the core of what Ecosmic does: lifting the fog in orbit, extracting insight, and enabling operators to act with clarity.
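For the curious, here is a minimal sketch of the textbook two-dimensional collision-probability approximation used in conjunction screening. To be clear, this is the generic small-object method (assuming uncorrelated in-plane uncertainties), not Ecosmic’s algorithm, and all the numbers are hypothetical:

```python
# Textbook 2-D collision-probability approximation: evaluate the Gaussian
# miss-distance density in the encounter plane and integrate it over the
# combined hard-body circle (valid when the hard-body radius is small
# compared to the position uncertainty).
import math

def collision_probability(miss_x_m: float, miss_y_m: float,
                          sigma_x_m: float, sigma_y_m: float,
                          hard_body_radius_m: float) -> float:
    density = math.exp(-0.5 * ((miss_x_m / sigma_x_m) ** 2
                               + (miss_y_m / sigma_y_m) ** 2))
    return hard_body_radius_m ** 2 / (2 * sigma_x_m * sigma_y_m) * density

pc = collision_probability(miss_x_m=150, miss_y_m=80,
                           sigma_x_m=200, sigma_y_m=100,
                           hard_body_radius_m=5)
print(f"Pc ~ {pc:.1e}")  # ~3e-4, above a typical 1e-4 manoeuvre threshold
```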
Space-based data centres could very well be “the next big thing” in space. Like any big shift, this one comes with its own risks and operational challenges, whether that’s solving the problem of close-knit formation flying in Google’s case or safely operating massive arrays in Starcloud’s. Those challenges are really opportunities: the Space Domain Awareness solutions needed to scale space-based data centres can raise the baseline for the whole industry. It’s long past time to stop treating space as a faraway frontier and unlock its full potential as an operational domain where we can build, act, and scale with confidence.
A few updates
Ecosmic has officially joined the NATO DIANA Accelerator Programme in the 2026 cohort, tackling the challenge of Resilient Space Operations with our defence product Valkyr, a 360° Operating System for space security covering threat detection, characterisation, and response planning.
If you want to connect with us in person, you can always drop by our office in Turin for a tasty espresso. Otherwise, you will also find us here:
- 20-23 January: Winter Satellite Workshop @ Espoo, Finland.
- 10-12 February: European Defence and Security Days @ Munich, Germany.
As the year winds down, we’d like to thank you for reading the Ecosmic Compiler and following our journey. Enjoy the holidays, get some time to rest, and we’ll see you in the new year with exciting updates!

[1] Report by the Lawrence Berkeley National Laboratory, via the MIT Technology Review.
[2] As reported by the MIT Technology Review.
[3] As reported by Fortune.
[4] White Paper by Starcloud.
[5] Blog Article and pre-print Paper by Google.
[6] Report by the European Space Policy Institute.
[7] As reported by the Wall Street Journal.
[8] Space Environment Report 2025 by ESA.
[9] White Paper by Starcloud.