A wide-ranging interview where Elon Musk makes the case that within 36 months, space will be the cheapest place to run AI — and that the real bottleneck for AI scaling isn't chips or algorithms, but electricity and hardware.
The Energy Wall
Electricity generation outside of China is essentially flat. Chip production is growing exponentially, but there's no matching growth in the power needed to run those chips.
"The output of chips is growing pretty much exponentially, but the output of electricity is flat. So how are you going to turn the chips on? Magical electricity fairies?"
- Data center power is massively underestimated: Beyond GPUs, you need power for networking, CPU/storage, cooling (40% overhead on hot days), and maintenance margins (20-25% reserve). Roughly 330,000 GB300s require about a gigawatt at the generation level.
- Gas turbines are sold out through 2030. The bottleneck is the vanes and blades — only three casting companies in the world make them.
- Solar tariffs in the US are "several hundred percent" and domestic production is minimal. Both Tesla and SpaceX are building toward 100 GW/year of solar cell production.
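The gigawatt figure above can be sanity-checked with back-of-envelope arithmetic. The cooling and reserve percentages are the ones quoted in the interview; the per-GPU draw and the non-GPU IT overhead are assumptions added here for illustration.

```python
# Back-of-envelope check: ~330,000 GB300s -> ~1 GW at the generation level.
GPU_COUNT = 330_000       # from the interview
WATTS_PER_GPU = 1_400     # assumed draw per GB300-class GPU (not quoted)
OTHER_IT_OVERHEAD = 0.20  # assumed: networking, CPU, storage (not quoted)
COOLING_OVERHEAD = 0.40   # cooling can add ~40% on hot days (quoted)
RESERVE_MARGIN = 0.25     # 20-25% maintenance/headroom reserve (quoted)

gpu_load_w = GPU_COUNT * WATTS_PER_GPU            # accelerators alone
it_load_w = gpu_load_w * (1 + OTHER_IT_OVERHEAD)  # plus the rest of the IT
facility_w = it_load_w * (1 + COOLING_OVERHEAD)   # plus worst-case cooling
generation_w = facility_w * (1 + RESERVE_MARGIN)  # plus upstream reserve

print(f"GPU load:          {gpu_load_w / 1e9:.2f} GW")
print(f"+ other IT:        {it_load_w / 1e9:.2f} GW")
print(f"+ cooling:         {facility_w / 1e9:.2f} GW")
print(f"+ reserve margin:  {generation_w / 1e9:.2f} GW at generation")
```

With these assumed factors the stack lands at roughly 0.97 GW, consistent with the interview's "roughly a gigawatt."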
Why Space Wins
- 5x solar effectiveness: No day-night cycle, no clouds, no atmosphere (which alone causes ~30% energy loss). No batteries needed.
- Cheaper solar cells: Space panels don't need heavy glass or framing since there's no weather. Combined with 5x effectiveness and no batteries, it's ~10x cheaper per watt.
- No permits: Getting permits to cover land in solar panels is extremely difficult. Space is a regulatory arbitrage play as much as a physics one.
- Scalability: Once you think in terms of what percentage of the Sun's power you're harnessing, you realize Earth has hard limits. Space doesn't.
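A rough sketch of how the ~5x and ~10x claims above compose. Only the ~30% atmospheric loss is quoted; the ground-side capacity factor and the panel-cost ratio are illustrative assumptions chosen to show how such figures could arise.

```python
# Illustrative arithmetic behind "5x effectiveness" and "~10x cheaper per watt".
ATMOSPHERIC_LOSS = 0.30        # ~30% energy loss through the atmosphere (quoted)
GROUND_CAPACITY_FACTOR = 0.28  # assumed: night, clouds, sun angle (not quoted)
PANEL_COST_RATIO = 0.5         # assumed: no glass/framing halves panel cost

# Same panel in orbit sees continuous, unattenuated sun.
ground_yield = (1 - ATMOSPHERIC_LOSS) * GROUND_CAPACITY_FACTOR
effectiveness = 1.0 / ground_yield                 # space vs. ground energy
cost_advantage = effectiveness / PANEL_COST_RATIO  # energy x hardware savings

print(f"space vs. ground energy per panel: ~{effectiveness:.1f}x")
print(f"space cost advantage per delivered watt: ~{cost_advantage:.0f}x")
```

This sketch omits the no-battery point, which would widen the gap further by adding storage costs to the ground-side figure.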
"In 36 months, but probably closer to 30 months, the most economically compelling place to put AI will be space."
The Scale of the Vision
Musk's prediction for five years out: SpaceX will be launching more AI compute capacity into space each year than the cumulative total deployed on Earth. That means hundreds of gigawatts per year, scaling toward a terawatt.
- 100 GW would require ~10,000 Starship launches per year — roughly one per hour
- Could be done with as few as 20-30 physical Starships on ~30-hour turnaround cycles
- SpaceX is gearing up for 10,000-30,000 launches per year
- SpaceX would effectively become a "hyper-hyperscaler"
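The cadence and fleet-size figures in the bullets above follow from simple division; all inputs here are the interview's own numbers.

```python
# Launch-cadence arithmetic behind the 100 GW/year figures.
HOURS_PER_YEAR = 365 * 24   # 8,760
launches_per_year = 10_000  # to lift ~100 GW annually (from the interview)
turnaround_hours = 30       # per-ship relaunch cycle (from the interview)

interval_hours = HOURS_PER_YEAR / launches_per_year    # time between launches
flights_per_ship = HOURS_PER_YEAR / turnaround_hours   # flights per ship/year
ships_needed = launches_per_year / flights_per_ship    # fleet to hit the rate

print(f"one launch every {interval_hours:.2f} hours")
print(f"each ship flies ~{flights_per_ship:.0f} times per year")
print(f"fleet required: ~{ships_needed:.0f} ships")
```

The fleet estimate comes out near 34 ships, slightly above the quoted 20-30 range; a turnaround of roughly 26 hours or less would bring it inside that range.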
Hardware Is the Real Moat
"Those who have lived in software land don't realize they're about to have a hard lesson in hardware."
Musk's view on AI competition: algorithmic innovations flow between companies within ~6 months. The real differentiator is who can scale hardware fastest — power generation, chip deployment, and infrastructure.
- 1-year bottleneck: Energy and power production
- 3-4 year bottleneck: Chips
- By the end of this year, chip output will exceed the industry's ability to power them on
The bottom line: The future of AI infrastructure isn't about better algorithms — it's about who can generate and deploy the most electricity, and Musk believes that means going to space.