Author Topic: M&A: xAI, A SpaceX Company  (Read 178819 times)

Offline Twark_Main

  • Senior Member
  • *****
  • Posts: 5313
  • Technically we ALL live in space
  • Liked: 2787
  • Likes Given: 1604
Re: M&A: xAI, A SpaceX Company
« Reply #160 on: 12/06/2025 01:11 pm »
Eh. I don't really agree, but we'll have to see what happens ... And it might be a couple decades before the answer is clear.

What I'm more concerned about in the short term is an investment bubble burst bringing down space stuff along with everything else. This could happen *regardless* of the soundness of the underlying technology - the dot com bubble of the late 90s wrecked private space projects then, though both the Internet in general and satellite internet specifically ultimately did work out.
I'm pretty sure that it'll go down a lot faster than it did with thermodynamic engines.

Compared to thermodynamic engines, AI is in the early "play around and see what works" stage.

Eventually, with steam engines, we figured out the limiting laws that governed their operation (Carnot's efficiency limit and what would later become Odum's specific power limit), and this new theoretical understanding enabled rapid progress which quickly approached these limits.

I expect we'll see the same in AI, where we develop a "Carnot's limit" for the maximum algorithmic efficiency of computation and ML (ie a software equivalent of what Landauer's Principle is for compute hardware).

Eventually the early drama blows over as the field matures, all the clever tricks get incorporated into the status quo, and it all gets boiled down to a boring flowchart sizing the "turbine" for a particular information processing task.
« Last Edit: 12/06/2025 01:45 pm by Twark_Main »

Offline meekGee

  • Senior Member
  • *****
  • Posts: 17714
  • N. California
  • Liked: 18000
  • Likes Given: 1502
Re: M&A: xAI, A SpaceX Company
« Reply #161 on: 12/06/2025 01:16 pm »
Eh. I don't really agree, but we'll have to see what happens ... And it might be a couple decades before the answer is clear.

What I'm more concerned about in the short term is an investment bubble burst bringing down space stuff along with everything else. This could happen *regardless* of the soundness of the underlying technology - the dot com bubble of the late 90s wrecked private space projects then, though both the Internet in general and satellite internet specifically ultimately did work out.
I'm pretty sure that it'll go down a lot faster than it did with thermodynamic engines.

Compared to thermodynamic engines, AI is in the early "play around and see what works" stage.

Eventually, with steam engines, we figured out the limiting laws that governed their operation (Carnot's efficiency limit and what would later become Odum's specific power limit), and this new theoretical understanding enabled rapid progress which quickly approached that limit.

I expect we'll see the same in AI, where we develop a "Carnot's limit" for the maximum algorithmic efficiency of computation and ML (ie a software equivalent of what Landauer's Principle is for compute hardware).
Yup, and cyber science in the 21st century moves a lot faster than the very primitive investigation of things like combustion dynamics that limited early 20th-century development of thermodynamic engines (and that were only robustly solved maybe a couple of decades ago).
ABCD - Always Be Counting Down

Offline Vultur

  • Senior Member
  • *****
  • Posts: 3397
  • Liked: 1510
  • Likes Given: 208
Re: M&A: xAI, A SpaceX Company
« Reply #162 on: 12/07/2025 01:34 am »
Eh. I don't really agree, but we'll have to see what happens ... And it might be a couple decades before the answer is clear.

What I'm more concerned about in the short term is an investment bubble burst bringing down space stuff along with everything else. This could happen *regardless* of the soundness of the underlying technology - the dot com bubble of the late 90s wrecked private space projects then, though both the Internet in general and satellite internet specifically ultimately did work out.
I'm pretty sure that it'll go down a lot faster than it did with thermodynamic engines.

Compared to thermodynamic engines, AI is in the early "play around and see what works" stage.

Eventually, with steam engines, we figured out the limiting laws that governed their operation (Carnot's efficiency limit and what would later become Odum's specific power limit), and this new theoretical understanding enabled rapid progress which quickly approached that limit.

I expect we'll see the same in AI, where we develop a "Carnot's limit" for the maximum algorithmic efficiency of computation and ML (ie a software equivalent of what Landauer's Principle is for compute hardware).
Yup, and cyber science in the 21st century moves a lot faster than the very primitive investigation of things like combustion dynamics that limited early 20th-century development of thermodynamic engines (and that were only robustly solved maybe a couple of decades ago).

There is, however, the possible opposite effect of the difference between a period of very rapid demand growth (due *both* to rapid population growth *and* incorporation of more and more of the world population into the industrial economy) and a period where both of those drivers of demand growth are far lower.

This is however a longer term issue than a potential investment bubble burst, which is a "next few years" question.

EDIT: the recent discussion of an IPO doesn't make me any more confident that this isn't a diversion from the original core goals.
« Last Edit: 12/07/2025 03:35 am by Vultur »

Offline thespacecow

  • Full Member
  • ****
  • Posts: 1302
  • e/acc
  • Liked: 1225
  • Likes Given: 530
Re: M&A: xAI, A SpaceX Company
« Reply #163 on: 12/08/2025 08:37 am »
https://x.com/elonmusk/status/1997706687155720229

Quote
A major additional factor should be considered.

Satellites with localized AI compute, where just the results are beamed back from low-latency, sun-synchronous orbit, will be the lowest cost way to generate AI bitstreams in <3 years.

And by far the fastest way to scale within 4 years, because easy sources of electrical power are already hard to find on Earth. 1 megaton/year of satellites with 100kW per satellite yields 100GW of AI added per year with no operating or maintenance cost, connecting via high-bandwidth lasers to the Starlink constellation.

The level beyond that is constructing satellite factories on the Moon and using a mass driver (electromagnetic railgun) to accelerate AI satellites to lunar escape velocity without the need for rockets. That scales to >100TW/year of AI and enables non-trivial progress towards becoming a Kardashev II civilization.
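The arithmetic behind the quoted scaling claim can be checked directly. A minimal sketch, using only the figures stated in the post; the per-satellite mass is derived from them, not stated there:

```python
# Sanity check of the scaling arithmetic in the quoted post.
# All inputs come straight from the post; the implied per-satellite
# mass is derived, not stated.

mass_per_year_t = 1_000_000        # "1 megaton/year of satellites"
power_per_sat_kw = 100             # "100kW per satellite"
power_added_per_year_gw = 100      # "100GW of AI added per year"

sats_per_year = power_added_per_year_gw * 1e6 / power_per_sat_kw
implied_mass_per_sat_t = mass_per_year_t / sats_per_year

print(f"Satellites/year: {sats_per_year:,.0f}")                       # 1,000,000
print(f"Implied mass per satellite: {implied_mass_per_sat_t:.1f} t")  # 1.0 t
```

So the claim internally implies a million one-tonne, 100 kW satellites per year, i.e. a specific power of about 100 W/kg.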

Offline thespacecow

  • Full Member
  • ****
  • Posts: 1302
  • e/acc
  • Liked: 1225
  • Likes Given: 530
Re: M&A: xAI, A SpaceX Company
« Reply #164 on: 12/08/2025 08:40 am »
https://x.com/CJHandmer/status/1997906033168330816

Quote
Here's one idea about SpaceX's next big thing. AI computing (inference) on orbit, but how the hell can SpaceX do this cheaper than just building more datacenters on the ground?

From first principles, it's an attractive proposition because the GPUs have extreme value per kg and extreme revenue per kW, both of which are relatively expensive. That is, the value prop somewhat washes out the pain of operating in space.

So I took a closer look. If anyone can make this work, it's a Starlink-derived system, so I started with the Starlink v3 satellites, with some high fidelity CEO-cad below.

Orbital parameters. Pick a sun synchronous orbit - so we're in full 1400 kW/m^2 sunlight at all times. No need for batteries. Deploy the solar array in "sun slicer" mode, facing full sun, but the edge is pointing in the orbital direction (bottom right in these images) to minimize drag. But the inference Starlink (Star Thought?) satellites don't have to scrape the atmosphere. Being in SSO, they'll need to use the rest of the Starlink constellation for backhaul via laser links anyway, and higher orbits actually improve worst case latency (serving equatorial customers around midnight or noon) very slightly. Too high though, and SSO is relatively full of debris. Let's pick 560 km.

A Starlink satellite in this orbit has full sun, so the back half is always shaded and relatively cool. The next hottest thing in the sky is the Earth, taking up almost half the sky to the bottom left in these images. So set up an MLI heat reflector there too, and then on the back side of the main bus we can use passive radiation to cool right down. The non-shiny parts of space are very cold. An array of this size produces about 130 kW of electrical power.

In this model, perhaps 200 H100-equivalent GPUs are racked on the main bus, generating 13,000 tokens per second, nowhere near enough to saturate a laser link. At $10/token, that's $4m of revenue per year. Assuming an all in cost of $50,000/kW (dominated by the GPUs on a kW or kg basis), that's something like 60% ROI per year. Obviously assumptions can vary.

But this neglects a broader point.

Starlink satellites have to do all kinds of things that inference satellites don't have to, in particular, talk to millions of customers all over the world. If most of our power is just going to inference, would we change the design?

Each solar array module produces about 6 kW. Each GPU consumes about 700 W. SpaceX could actually install GPUs directly to the solar module, matching optimal voltage and current levels and connecting each back to the main bus not with an insulated high voltage power cable, but a local wifi connection. There's then no real limit on how many modules can connect to each bus, which drives the overall power density of the system towards the thinnest solar arrays that can be flown, probably close to 1 kg/m^2 in the limit.

That is, the inference is performed by a thin piece of silicon that always faces the sun, connected to a somewhat smaller piece of thin silicon with billions of logic gates in it. This helps the thermal problem too, because we're not bringing a bunch of power into one concentrated place.

If one Starship can launch 100 T to LEO, then that gets close to 30 MW of inference per launch. 1000 launches is 30 GW. Now we're talking real scale - and provided the revenue per kWh is greater than about $4.00, I think the economics work out.

I've seen a bunch of high inclination Starlink launches from Vandenberg recently, but I don't think any of them were going to SSO.

In any case, a ring of inference satellites visible at dawn and dusk running north south will be awesome.

Note: by $10/token he meant $10 per million tokens.
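His economics can be re-run in a few lines. A sketch using the figures from the quoted post, with the price corrected to $10 per million tokens:

```python
# Re-running the economics from the quoted post, using the corrected
# price of $10 per *million* tokens. All other figures are from the post.

SECONDS_PER_YEAR = 365.25 * 24 * 3600

tokens_per_s = 13_000              # "13,000 tokens per second"
price_per_token = 10 / 1e6         # $10 per million tokens (corrected)
gpus = 200                         # "200 H100-equivalent GPUs"
watts_per_gpu = 700                # per the post
cost_per_kw = 50_000               # "all in cost of $50,000/kW"

revenue_per_year = tokens_per_s * SECONDS_PER_YEAR * price_per_token
power_kw = gpus * watts_per_gpu / 1000
capital_cost = power_kw * cost_per_kw
roi = revenue_per_year / capital_cost
revenue_per_kwh = revenue_per_year / (power_kw * SECONDS_PER_YEAR / 3600)

print(f"Revenue/year: ${revenue_per_year/1e6:.1f}m")   # ~$4.1m
print(f"Capital cost: ${capital_cost/1e6:.1f}m")       # $7.0m
print(f"Simple ROI:   {roi:.0%}/year")                 # ~59%
print(f"Revenue/kWh:  ${revenue_per_kwh:.2f}")         # ~$3.34
```

The numbers do reproduce his "something like 60% ROI per year," and the roughly $3-4/kWh revenue figure is where his later "$4.00 per kWh" breakeven comes from.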
« Last Edit: 12/09/2025 01:44 am by thespacecow »

Offline jongoff

  • Recovering Rocket Plumber/Space Entrepreneur
  • Senior Member
  • *****
  • Posts: 7173
  • Erie, CO
  • Liked: 4809
  • Likes Given: 2749
Re: M&A: xAI, A SpaceX Company
« Reply #165 on: 12/08/2025 09:59 pm »
https://x.com/elonmusk/status/1997706687155720229

Quote
A major additional factor should be considered.

Satellites with localized AI compute, where just the results are beamed back from low-latency, sun-synchronous orbit, will be the lowest cost way to generate AI bitstreams in <3 years.

And by far the fastest way to scale within 4 years, because easy sources of electrical power are already hard to find on Earth. 1 megaton/year of satellites with 100kW per satellite yields 100GW of AI added per year with no operating or maintenance cost, connecting via high-bandwidth lasers to the Starlink constellation.

The level beyond that is constructing satellite factories on the Moon and using a mass driver (electromagnetic railgun) to accelerate AI satellites to lunar escape velocity without the need for rockets. That scales to >100TW/year of AI and enables non-trivial progress towards becoming a Kardashev II civilization.

Color me a little skeptical that they'll be launching 5000-10,000 Starship flights and building 1,000,000 1mT 100kW AI computing node satellites within the next 4yrs. Is it physically possible to build satellites in that quantity? Sure. But it has also taken SpaceX about a decade since starting Starlink and beginning F9 reuse to get to ~170 flights/yr and ~3,400 satellites per year. What's another few orders of magnitude of scale-up between friends?

~Jon
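The scale-up being questioned here can be quantified against the rates cited in the post. A sketch; the 100 t per Starship launch payload is assumed (it's the figure used elsewhere in the thread, not stated in this post):

```python
# What the quoted 1 Mt/year target implies, versus the current rates
# cited in the post. 100 t/launch to LEO is an assumed Starship payload.

target_mass_t_per_year = 1_000_000
payload_t_per_launch = 100          # assumed
current_flights_per_year = 170      # F9 cadence cited in the post
current_sats_per_year = 3_400       # Starlink production cited in the post
sats_needed_per_year = 1_000_000    # 1 t satellites, per the arithmetic upthread

launches_needed = target_mass_t_per_year / payload_t_per_launch
print(f"Launches/year needed: {launches_needed:,.0f}")    # 10,000
print(f"Flight-rate scale-up: "
      f"{launches_needed / current_flights_per_year:.0f}x")        # ~59x
print(f"Satellite production scale-up: "
      f"{sats_needed_per_year / current_sats_per_year:.0f}x")      # ~294x
```

Roughly a 59x jump in launch cadence and a ~294x jump in satellite production, inside four years, is the "few orders of magnitude" being joked about.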

Offline Vultur

  • Senior Member
  • *****
  • Posts: 3397
  • Liked: 1510
  • Likes Given: 208
Re: M&A: xAI, A SpaceX Company
« Reply #166 on: 12/08/2025 11:44 pm »
Yeah, 1,000,000 tons in orbit would probably be a couple decades away even by very aggressive growth assumptions.

Nothing scales that fast (5 in 2025 to 5000 in 2029-2030).

This feels like a theoretical extreme rather than a realistic plan.

Offline jongoff

  • Recovering Rocket Plumber/Space Entrepreneur
  • Senior Member
  • *****
  • Posts: 7173
  • Erie, CO
  • Liked: 4809
  • Likes Given: 2749
Re: M&A: xAI, A SpaceX Company
« Reply #167 on: 12/09/2025 01:26 am »
Yeah, 1,000,000 tons in orbit would probably be a couple decades away even by very aggressive growth assumptions.

Nothing scales that fast (5 in 2025 to 5000 in 2029-2030).

This feels like a theoretical extreme rather than a realistic plan.


Bingo.

~Jon
« Last Edit: 12/09/2025 01:26 am by jongoff »

Offline Vultur

  • Senior Member
  • *****
  • Posts: 3397
  • Liked: 1510
  • Likes Given: 208
Re: M&A: xAI, A SpaceX Company
« Reply #168 on: 12/09/2025 05:58 am »
Yeah, 1,000,000 tons in orbit would probably be a couple decades away even by very aggressive growth assumptions.

Nothing scales that fast (5 in 2025 to 5000 in 2029-2030).

This feels like a theoretical extreme rather than a realistic plan.


Bingo.

~Jon

Which raises the question of why it's being said, and why now.

While I don't want to imply dishonesty,* I think there is possibly a strong element of PR here. The "Moon as AI factory" thing only showed up after Acting Admin Duffy mentioned re-competing the Artemis HLS contract. This may be a move to show commitment to the Moon as a goal.

So far I don't think any money has been spent or hardware built on this Moon AI factory stuff, and it's probably quite a while out. If it looks like it won't work out (IMO the far more plausible outcome) it'll probably be quietly dropped.

*I think it's an *honest* theoretical extreme - I absolutely think Elon Musk believes this is *possible*, at least eventually. That doesn't mean he thinks it's the most plausible or easiest path, though.
« Last Edit: 12/09/2025 06:01 am by Vultur »

Offline meekGee

  • Senior Member
  • *****
  • Posts: 17714
  • N. California
  • Liked: 18000
  • Likes Given: 1502
Re: M&A: xAI, A SpaceX Company
« Reply #169 on: 12/09/2025 08:25 am »
Yeah, 1,000,000 tons in orbit would probably be a couple decades away even by very aggressive growth assumptions.

Nothing scales that fast (5 in 2025 to 5000 in 2029-2030).

This feels like a theoretical extreme rather than a realistic plan.


Bingo.

~Jon

Which raises the question of why it's being said, and why now.

While I don't want to imply dishonesty,* I think there is possibly a strong element of PR here. The "Moon as AI factory" thing only showed up after Acting Admin Duffy mentioned re-competing the Artemis HLS contract. This may be a move to show commitment to the Moon as a goal.

So far I don't think any money has been spent or hardware built on this Moon AI factory stuff, and it's probably quite a while out. If it looks like it won't work out (IMO the far more plausible outcome) it'll probably be quietly dropped.

*I think it's an *honest* theoretical extreme - I absolutely think Elon Musk believes this is *possible*, at least eventually. That doesn't mean he thinks it's the most plausible or easiest path, though.
He's right on brand... "At SpaceX we turn the impossible into the very late," etc.

100 GW/year (or a megaton) won't happen in 5 years. But the previous best guess was "one day maybe my children will see this," so you know, suppose it takes 10.
 
ABCD - Always Be Counting Down

Offline scaesare

  • Member
  • Posts: 60
  • Liked: 68
  • Likes Given: 134
Re: M&A: xAI, A SpaceX Company
« Reply #170 on: 12/09/2025 12:33 pm »
https://x.com/CJHandmer/status/1997906033168330816

Quote
Here's one idea about SpaceX's next big thing. AI computing (inference) on orbit, but how the hell can SpaceX do this cheaper than just building more datacenters on the ground?

From first principles, it's an attractive proposition because the GPUs have extreme value per kg and extreme revenue per kW, both of which are relatively expensive. That is, the value prop somewhat washes out the pain of operating in space.

So I took a closer look. If anyone can make this work, it's a Starlink-derived system, so I started with the Starlink v3 satellites, with some high fidelity CEO-cad below.

Orbital parameters. Pick a sun synchronous orbit - so we're in full 1400 kW/m^2 sunlight at all times. No need for batteries. Deploy the solar array in "sun slicer" mode, facing full sun, but the edge is pointing in the orbital direction (bottom right in these images) to minimize drag. But the inference Starlink (Star Thought?) satellites don't have to scrape the atmosphere. Being in SSO, they'll need to use the rest of the Starlink constellation for backhaul via laser links anyway, and higher orbits actually improve worst case latency (serving equatorial customers around midnight or noon) very slightly. Too high though, and SSO is relatively full of debris. Let's pick 560 km.

A Starlink satellite in this orbit has full sun, so the back half is always shaded and relatively cool. The next hottest thing in the sky is the Earth, taking up almost half the sky to the bottom left in these images. So set up an MLI heat reflector there too, and then on the back side of the main bus we can use passive radiation to cool right down. The non-shiny parts of space are very cold. An array of this size produces about 130 kW of electrical power.

In this model, perhaps 200 H100-equivalent GPUs are racked on the main bus, generating 13,000 tokens per second, nowhere near enough to saturate a laser link. At $10/token, that's $4m of revenue per year. Assuming an all in cost of $50,000/kW (dominated by the GPUs on a kW or kg basis), that's something like 60% ROI per year. Obviously assumptions can vary.

But this neglects a broader point.

Starlink satellites have to do all kinds of things that inference satellites don't have to, in particular, talk to millions of customers all over the world. If most of our power is just going to inference, would we change the design?

Each solar array module produces about 6 kW. Each GPU consumes about 700 W. SpaceX could actually install GPUs directly to the solar module, matching optimal voltage and current levels and connecting each back to the main bus not with an insulated high voltage power cable, but a local wifi connection. There's then no real limit on how many modules can connect to each bus, which drives the overall power density of the system towards the thinnest solar arrays that can be flown, probably close to 1 kg/m^2 in the limit.

That is, the inference is performed by a thin piece of silicon that always faces the sun, connected to a somewhat smaller piece of thin silicon with billions of logic gates in it. This helps the thermal problem too, because we're not bringing a bunch of power into one concentrated place.

If one Starship can launch 100 T to LEO, then that gets close to 30 MW of inference per launch. 1000 launches is 30 GW. Now we're talking real scale - and provided the revenue per kWh is greater than about $4.00, I think the economics work out.

I've seen a bunch of high inclination Starlink launches from Vandenberg recently, but I don't think any of them were going to SSO.

In any case, a ring of inference satellites visible at dawn and dusk running north south will be awesome.

Note: by $10/token he meant $10 per million tokens.

He also says:
Quote
we're in full 1400 kW/m^2 sunlight at all times.

I think he's off by three orders of magnitude here... it's a unit problem, as it's about 1,400 watts per m^2, not kilowatts.
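The unit correction is easy to make concrete: at the actual solar constant of ~1,361 W/m², the ~130 kW array from the quoted post implies a plausible area, whereas 1,400 kW/m² would imply an absurdly tiny one. A sketch; the cell efficiency is an assumed typical value, not from the post:

```python
# Checking the unit correction. The 22% cell efficiency is an assumption;
# the 130 kW array power comes from the quoted post.

SOLAR_CONSTANT_W_M2 = 1361      # solar irradiance at 1 AU, W/m^2
cell_efficiency = 0.22          # assumed, typical for modern space cells
array_power_w = 130_000         # "about 130 kW" from the quoted post

area_correct = array_power_w / (SOLAR_CONSTANT_W_M2 * cell_efficiency)
area_if_kw = array_power_w / (1_400_000 * cell_efficiency)  # typo'd 1400 kW/m^2

print(f"Array area at 1361 W/m^2:  {area_correct:,.0f} m^2")  # ~434 m^2
print(f"Array area at 1400 kW/m^2: {area_if_kw:.2f} m^2")     # ~0.42 m^2
```

A few hundred square meters is consistent with a large Starlink-class array; less than half a square meter clearly isn't.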

Offline ZachF

  • Full Member
  • ****
  • Posts: 1930
  • Immensely complex & high risk
  • NH, USA, Earth
  • Liked: 3154
  • Likes Given: 646
Re: M&A: xAI, A SpaceX Company
« Reply #171 on: 12/10/2025 09:41 pm »
Yeah, 1,000,000 tons in orbit would probably be a couple decades away even by very aggressive growth assumptions.

Nothing scales that fast (5 in 2025 to 5000 in 2029-2030).

This feels like a theoretical extreme rather than a realistic plan.

My guess is that we will see a 200x18m, >1000 tonne to LEO super starship announced after V4 if these plans are even semi-serious.
artist, so take opinions expressed above with a well-rendered grain of salt...
https://www.instagram.com/artzf/

Offline Vultur

  • Senior Member
  • *****
  • Posts: 3397
  • Liked: 1510
  • Likes Given: 208
Re: M&A: xAI, A SpaceX Company
« Reply #172 on: 12/11/2025 12:50 am »
Yeah, 1,000,000 tons in orbit would probably be a couple decades away even by very aggressive growth assumptions.

Nothing scales that fast (5 in 2025 to 5000 in 2029-2030).

This feels like a theoretical extreme rather than a realistic plan.

My guess is that we will see a 200x18m, >1000 tonne to LEO super starship announced after V4 if these plans are even semi-serious.

I'm not sure how testable or launchable that would be. Maybe very, very far offshore ...

Offline thespacecow

  • Full Member
  • ****
  • Posts: 1302
  • e/acc
  • Liked: 1225
  • Likes Given: 530
Re: M&A: xAI, A SpaceX Company
« Reply #173 on: 12/11/2025 03:36 am »
https://x.com/Robotbeat/status/1998765107510137119

Quote
I’d be surprised if SpaceX doesn’t start launching megawatts of AI compute into space by 2027. Most people balking at the idea of datacenters in space haven’t grappled much with Starlink even as it is today. They’re grounding their expectations on traditional satellites.

Heck, Megawatts of inference by end of 2026 is also pretty likely. Remember they’re planning to put this on basically modified Starlinks launched on Starship.

A single Starship launch full of modded Starlinks will be a few Megawatts. It’s just a question of whether they’ll prioritize putting inference on the first dozen or so Starlink Starship launches or not.


https://x.com/Robotbeat/status/1998791259939549469

Quote
People gonna be mad when AI compute in space is just weird Starlinks starting 2026/7. By the time we actually get large single datacenters in space, the idea of substantial AI compute in orbit will already have been normalized & people will forget that they totally dismissed it
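The "few Megawatts per launch" figure can be roughly reconstructed. A sketch; the per-satellite power comes from Musk's post later in this thread (V3 at ~20 kW), and the satellites-per-launch count is purely an assumption:

```python
# Rough check of "a few Megawatts" per Starship launch of modified
# Starlinks. 20 kW per V3 satellite is from a post in this thread;
# the satellites-per-launch figure is an assumption.

power_per_sat_kw = 20      # Starlink V3 power, per Musk's post
sats_per_launch = 60       # assumed; not stated in the quoted posts

mw_per_launch = power_per_sat_kw * sats_per_launch / 1000
print(f"~{mw_per_launch:.1f} MW per launch")   # ~1.2 MW
```

So even at V3's stated 20 kW, a single full Starship of such satellites lands in the low single-digit megawatts, which is the claim being made.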

Offline thespacecow

  • Full Member
  • ****
  • Posts: 1302
  • e/acc
  • Liked: 1225
  • Likes Given: 530
Re: M&A: xAI, A SpaceX Company
« Reply #174 on: 12/11/2025 03:37 am »
https://x.com/elonmusk/status/1998872465087541752

Quote
SpaceX has way more satellites in orbit than the rest of the world combined, so maybe we know a thing or two about the subject 🤣

Starlink V3 will be 20kW and launched at scale around Q4 next year. No problem to scale that to >100kW if the satellite mass is shifted towards solar arrays and radiators for AI compute, instead of giant phased array antennas for Internet connectivity.

As you mention, it would use the same laser comms system as Starlink to connect to Starlink.

An AI satellite is easier, not harder, than the Starlink V3 design, which is a marvel of engineering created by an epic team of humans. I am so proud of the @SpaceX team.

Offline DigitalMan

  • Full Member
  • ****
  • Posts: 1805
  • Liked: 1261
  • Likes Given: 76
Re: M&A: xAI, A SpaceX Company
« Reply #175 on: 12/11/2025 02:49 pm »
There was a follow-up to that after someone mentioned 100kW isn't a full rack of H100s; he replied that 150kW would also be fine.

There is a lot of discussion on X about the whole subject.

Offline spacenut

  • Senior Member
  • *****
  • Posts: 5954
  • East Alabama
  • Liked: 2914
  • Likes Given: 3639
Re: M&A: xAI, A SpaceX Company
« Reply #176 on: 12/11/2025 03:16 pm »
I picture the cloud for data centers in a higher polar orbit, in a plane that stays aligned with the Earth's motion around the Sun, so it's always in sunlight for continuous power. With SpaceX and others contemplating the same thing, we may have a ring around the Earth like Saturn's, only in a polar orbit. We may be able to see this ring at dawn and dusk. Being in a ring, it will be able to reach everyone at all times. Local noon and midnight will be the longest distance from the ring, but it could relay signals back to Earth via Starlink at any time. Starlinks might be the go-between relays for these data centers.
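The always-sunlit ring described above is a dawn-dusk sun-synchronous orbit, and the inclination it requires follows from the J2 nodal-precession condition (the orbit plane must precess one revolution per year). A sketch using standard Earth constants, at the 560 km altitude picked in the Handmer post upthread:

```python
import math

# Inclination required for a sun-synchronous orbit at a given altitude,
# from the J2 nodal-precession condition (360 deg of precession per year).
# Standard Earth constants; circular orbit assumed.

MU = 3.986004418e14        # Earth GM, m^3/s^2
J2 = 1.08262668e-3         # Earth oblateness coefficient
R_E = 6378.137e3           # Earth equatorial radius, m
PRECESSION = 2 * math.pi / (365.2422 * 86400)   # rad/s, one rev per year

def sso_inclination_deg(alt_km: float) -> float:
    a = R_E + alt_km * 1e3                      # semi-major axis
    n = math.sqrt(MU / a**3)                    # mean motion, rad/s
    cos_i = -PRECESSION / (1.5 * J2 * n * (R_E / a) ** 2)
    return math.degrees(math.acos(cos_i))

print(f"{sso_inclination_deg(560):.1f} deg")    # ~97.6 deg
```

The slightly retrograde ~97.6 degree inclination is why these would look like near-polar launches from Vandenberg, and why the ring would sit over the terminator, visible at dawn and dusk.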

Offline Vultur

  • Senior Member
  • *****
  • Posts: 3397
  • Liked: 1510
  • Likes Given: 208
Re: M&A: xAI, A SpaceX Company
« Reply #177 on: 12/11/2025 03:44 pm »
https://x.com/Robotbeat/status/1998765107510137119

Quote
I’d be surprised if SpaceX doesn’t start launching megawatts of AI compute into space by 2027. Most people balking at the idea of datacenters in space haven’t grappled much with Starlink even as it is today. They’re grounding their expectations on traditional satellites.

Heck, Megawatts of inference by end of 2026 is also pretty likely. Remember they’re planning to put this on basically modified Starlinks launched on Starship.

A single Starship launch full of modded Starlinks will be a few Megawatts. It’s just a question of whether they’ll prioritize putting inference on the first dozen or so Starlink Starship launches or not.


https://x.com/Robotbeat/status/1998791259939549469

Quote
People gonna be mad when AI compute in space is just weird Starlinks starting 2026/7. By the time we actually get large single datacenters in space, the idea of substantial AI compute in orbit will already have been normalized & people will forget that they totally dismissed it

See, the very near term part of the plan, "compute on Starlink v3" seems entirely plausible to me.

It's the longer term expectation of continued exponential growth in demand, scaling up to terawatts of AI use, that doesn't.

Offline novo2044

  • Full Member
  • ***
  • Posts: 304
  • USA
  • Liked: 512
  • Likes Given: 70
Re: M&A: xAI, A SpaceX Company
« Reply #178 on: 12/11/2025 07:55 pm »
https://x.com/Robotbeat/status/1998765107510137119

Quote
I’d be surprised if SpaceX doesn’t start launching megawatts of AI compute into space by 2027. Most people balking at the idea of datacenters in space haven’t grappled much with Starlink even as it is today. They’re grounding their expectations on traditional satellites.

Heck, Megawatts of inference by end of 2026 is also pretty likely. Remember they’re planning to put this on basically modified Starlinks launched on Starship.

A single Starship launch full of modded Starlinks will be a few Megawatts. It’s just a question of whether they’ll prioritize putting inference on the first dozen or so Starlink Starship launches or not.


https://x.com/Robotbeat/status/1998791259939549469

Quote
People gonna be mad when AI compute in space is just weird Starlinks starting 2026/7. By the time we actually get large single datacenters in space, the idea of substantial AI compute in orbit will already have been normalized & people will forget that they totally dismissed it

See, the very near term part of the plan, "compute on Starlink v3" seems entirely plausible to me.

It's the longer term expectation of continued exponential growth in demand, scaling up to terawatts of AI use, that doesn't.
Musk hates beamed solar, but for all its inefficiencies, if you have Terawatts of it, you can probably build a case for it even if the AI doesn't pan out.

Offline thespacecow

  • Full Member
  • ****
  • Posts: 1302
  • e/acc
  • Liked: 1225
  • Likes Given: 530
Re: M&A: xAI, A SpaceX Company
« Reply #179 on: 12/12/2025 03:56 am »
See, the very near term part of the plan, "compute on Starlink v3" seems entirely plausible to me.

It's the longer term expectation of continued exponential growth in demand, scaling up to terawatts of AI use, that doesn't.

As long as it's constellation-based, they can always stop launching if demand dries up; this is true for the broadband/D2D business as well. One could also ask if there's enough demand for 40,000 broadband satellites and 15,000 D2D satellites. We don't know, but they can grow the constellation gradually until they hit the limit, so I don't see this as a big risk.

There's more risk if they start investing in lunar factories; a lot depends on whether they can find intermediate business to support the buildout without relying solely on AI revenue. This is what I wanted to discuss in the lunar thread in the Starship section, but that thread got derailed by people arguing about the feasibility of orbital data centers, oh well...
