Author Topic: Elon Musk’s new Space AI ambitions  (Read 44377 times)

Offline Vultur

  • Senior Member
  • *****
  • Posts: 3227
  • Liked: 1427
  • Likes Given: 196
Re: Elon Musk’s new Space AI ambitions
« Reply #20 on: 12/03/2025 03:09 pm »
So fast forward to AI. You think the type of tasks it does is exhausted,

No, that's not what I'm saying at all, here.

I'm saying that even if tons of new applications are invented there's still a fairly hard upper limit which is ultimately more or less set by the number of "technologically connected" people. And that hard limit, at least in the next century or so, is probably below terawatt level use.

Especially with efficiency improvements. And I think fairly dramatic efficiency improvements are ultimately necessary to make widespread use of AI cost effective if/when something closer to the true cost is passed on to the user, which currently often isn't the case.

Quote
I'm pretty sure "TettaWatts" was not thrown around without regard.

Oh, I think there's thought behind it. But I think it's a hypothetical of what can be done assuming unbounded demand (as well as not hitting any limits in scaling up manufacture, etc.) It's perfectly valid in that context, I just don't think that's the most likely scenario.



Offline RedLineTrain

  • Senior Member
  • *****
  • Posts: 3301
  • Liked: 2907
  • Likes Given: 12056
Re: Elon Musk’s new Space AI ambitions
« Reply #21 on: 12/03/2025 03:20 pm »
Elon has spoken about inference tasks, with the reason being that a short delay is acceptable
Other way around. Inference requires low latency. Training (which can take months) does not.

You're talking about different latencies.  Many or most inference jobs can tolerate some latency to and from the user.  On the other hand, training jobs require extremely low latencies among the training GPUs in a coherent cluster.

Inference jobs calling agents/tools and moving data to/from the training cluster are separate discussions.

Offline meekGee

  • Senior Member
  • *****
  • Posts: 17563
  • N. California
  • Liked: 17881
  • Likes Given: 1502
Re: Elon Musk’s new Space AI ambitions
« Reply #22 on: 12/03/2025 05:57 pm »
So fast forward to AI. You think the type of tasks it does is exhausted,

No, that's not what I'm saying at all, here.

I'm saying that even if tons of new applications are invented there's still a fairly hard upper limit which is ultimately more or less set by the number of "technologically connected" people. And that hard limit, at least in the next century or so, is probably below terawatt level use.

Especially with efficiency improvements. And I think fairly dramatic efficiency improvements are ultimately necessary to make widespread use of AI cost effective if/when something closer to the true cost is passed on to the user, which currently often isn't the case.

Quote
I'm pretty sure "TettaWatts" was not thrown around without regard.

Oh, I think there's thought behind it. But I think it's a hypothetical of what can be done assuming unbounded demand (as well as not hitting any limits in scaling up manufacture, etc.) It's perfectly valid in that context, I just don't think that's the most likely scenario.
But doesn't that limit mean you're imagining only a "person interacting with an AI" kind of application?

What if the AI is driving robotics?  Or vehicles? Or corporate AI workers?

So many thoughts to be thought....

And even with people connectivity, what is the hard limit? A kilowatt per person?

It adds up very quickly, I think.

Ugh. I forgot about the military. A lot of the people that don't have connectivity might actually consume AI compute power passively.
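The "kilowatt per person" arithmetic does add up quickly. A minimal sketch (the population and per-person power figures are illustrative assumptions, not data from the thread):

```python
# Back-of-envelope: AI power demand as (connected people) x (power per person).
# All inputs are illustrative assumptions.

def total_ai_power_tw(connected_people: float, watts_per_person: float) -> float:
    """Total AI power draw in terawatts."""
    return connected_people * watts_per_person / 1e12

# ~5 billion "technologically connected" people at 1 kW of AI compute each:
print(total_ai_power_tw(5e9, 1_000))  # 5.0 TW -- well past the terawatt mark
# ...but at 100 W each, the total stays below a terawatt:
print(total_ai_power_tw(5e9, 100))    # 0.5 TW
```

So the whole argument turns on where the per-person figure lands.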
« Last Edit: 12/03/2025 06:33 pm by meekGee »
ABCD - Always Be Counting Down

Offline DanielW

  • Full Member
  • ****
  • Posts: 635
  • L-22
  • Liked: 581
  • Likes Given: 91
Re: Elon Musk’s new Space AI ambitions
« Reply #23 on: 12/03/2025 06:01 pm »
I think Musk is talking inference not training for the initial build out.

Training data centers need to be colocated, with extremely high-speed interconnects between nodes. This requires on-orbit assembly.

Inference also requires a great deal of power and compute because of the vast number of users, but it can be distributed. There are some baseline requirements driven by the size of the model, but generally a single query can be handled by a single rack or even a single GPU.

This means that you can off-load electricity consumption to space as fast as you can launch, without on-orbit assembly, just by putting GPUs on a Starlink satellite bus. You can do this until you saturate the need for inference. This frees up resources and goodwill on Earth to be used for training.

Once inference has bootstrapped the process, you can worry about assembling training centers.

So I think Musk is thinking about what we can do easily right now rather than what makes the most sense from a pure physics standpoint.

Edit to add this from the MIT Technology Review. https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/

"It’s now estimated that 80–90% of computing power for AI is used for inference."
« Last Edit: 12/03/2025 06:10 pm by DanielW »

Offline launchwatcher

  • Full Member
  • ****
  • Posts: 833
  • Liked: 817
  • Likes Given: 1238
Re: Elon Musk’s new Space AI ambitions
« Reply #24 on: 12/03/2025 06:47 pm »
I think Musk is talking inference not training for the initial build out.

Training data-centers need to be colocated with extremely high-speed interconnects between nodes. This requires On-Orbit assembly.
Google is investigating whether they can avoid on-orbit assembly by instead using free-space optical data links between satellites flying in close (under 1km) formation:

https://research.google/blog/exploring-a-space-based-scalable-ai-infrastructure-system-design/

Quote
At the altitude of our planned constellation, the non-sphericity of Earth's gravitational field, and potentially atmospheric drag, are the dominant non-Keplerian effects impacting satellite orbital dynamics. In the figure below, we show trajectories (over one full orbit) for an illustrative 81-satellite constellation configuration in the orbital plane, at a mean cluster altitude of 650 km. The cluster radius is R=1 km, with the distance between next-nearest-neighbor satellites oscillating between ~100–200m, under the influence of Earth’s gravity.

The models show that, with satellites positioned just hundreds of meters apart, we will likely only require modest station-keeping maneuvers to maintain stable constellations within our desired sun-synchronous orbit.
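For scale, a cluster at the quoted 650 km altitude completes one orbit in roughly 97 minutes, which follows directly from Kepler's third law (the constants are standard values, not figures from the Google article):

```python
import math

MU_EARTH = 3.986004418e14  # m^3/s^2, Earth's standard gravitational parameter
R_EARTH = 6_371e3          # m, mean Earth radius

def orbital_period_s(altitude_m: float) -> float:
    """Circular-orbit period from Kepler's third law: T = 2*pi*sqrt(a^3/mu)."""
    a = R_EARTH + altitude_m  # semi-major axis of a circular orbit
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH)

T = orbital_period_s(650e3)
print(f"Period at 650 km: {T / 60:.1f} min")  # ~97.6 min
```

The ~100-200 m neighbor oscillations in the quote play out once per one of these orbits.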

Offline Vultur

  • Senior Member
  • *****
  • Posts: 3227
  • Liked: 1427
  • Likes Given: 196
Re: Elon Musk’s new Space AI ambitions
« Reply #25 on: 12/03/2025 07:04 pm »
So fast forward to AI. You think the type of tasks it does is exhausted,

No, that's not what I'm saying at all, here.

I'm saying that even if tons of new applications are invented there's still a fairly hard upper limit which is ultimately more or less set by the number of "technologically connected" people. And that hard limit, at least in the next century or so, is probably below terawatt level use.

Especially with efficiency improvements. And I think fairly dramatic efficiency improvements are ultimately necessary to make widespread use of AI cost effective if/when something closer to the true cost is passed on to the user, which currently often isn't the case.

Quote
I'm pretty sure "TettaWatts" was not thrown around without regard.

Oh, I think there's thought behind it. But I think it's a hypothetical of what can be done assuming unbounded demand (as well as not hitting any limits in scaling up manufacture, etc.) It's perfectly valid in that context, I just don't think that's the most likely scenario.
But doesn't that limit mean you're imagining only a "person interacting with an AI" kind of application?

What if the AI is driving robotics?  Or vehicles? Or corporate AI workers?

That raises the limit but does not change the fundamentals. Those robots building stuff or vehicles delivering stuff are still ultimately providing goods and services to people. The total demand in the economy is still based on people.

"Technologically connected" people in this context doesn't necessarily just mean an internet connection, it means people who have decent access to/participation in the technological world economy (which isn't everyone).

There's no point in having automated factories building things there is no demand for. It's physically possible, but doesn't make sense.

Also I think the cost issues still hit. I don't think AI doing a lot of those things will end up being cost effective at current total costs, once (or if) those costs are passed on to the end user.

So the terawatt+ scenario isn't impossible, no. But I think it requires a fairly narrow needle to be threaded. If AI gets much more efficient, you don't get terawatt power use; if AI doesn't get more efficient, its use probably becomes more limited due to cost. There's a very narrow range in there (assuming those two scenarios don't overlap entirely, leaving no range) where AI is still energy hungry enough to demand terawatt+ power but cheap enough to be used widely once real costs catch up.

(And that's assuming other factors not directly related to the technology itself don't get in the way. Which strikes me as likely.)
« Last Edit: 12/03/2025 07:16 pm by Vultur »

Offline DanielW

  • Full Member
  • ****
  • Posts: 635
  • L-22
  • Liked: 581
  • Likes Given: 91
Re: Elon Musk’s new Space AI ambitions
« Reply #26 on: 12/03/2025 07:14 pm »
I think Musk is talking inference not training for the initial build out.

Training data-centers need to be colocated with extremely high-speed interconnects between nodes. This requires On-Orbit assembly.
Google is investigating whether they can avoid on-orbit assembly by instead using free-space optical data links between satellites flying in close (under 1km) formation:

https://research.google/blog/exploring-a-space-based-scalable-ai-infrastructure-system-design/

Quote
At the altitude of our planned constellation, the non-sphericity of Earth's gravitational field, and potentially atmospheric drag, are the dominant non-Keplerian effects impacting satellite orbital dynamics. In the figure below, we show trajectories (over one full orbit) for an illustrative 81-satellite constellation configuration in the orbital plane, at a mean cluster altitude of 650 km. The cluster radius is R=1 km, with the distance between next-nearest-neighbor satellites oscillating between ~100–200m, under the influence of Earth’s gravity.

The models show that, with satellites positioned just hundreds of meters apart, we will likely only require modest station-keeping maneuvers to maintain stable constellations within our desired sun-synchronous orbit.

I think Musk is a step ahead. Since most power consumption is inference, he can launch that without the added complexity of tight interconnects. If built into larger Starlink satellites, you increase the addressable market for Starlink while having built-in data connections for the AI workloads.

It will also be hard to beat the latency, since the inference will most likely happen right over your head and be beamed straight back down (though not quite as good as living next to a datacenter).
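As a sanity check on the latency point, the speed-of-light round trip to a low-orbit satellite is only a few milliseconds. A sketch (the 550 km altitude is an assumption in line with Starlink's lower shells; real links add processing and queuing delay on top):

```python
C = 299_792_458  # m/s, speed of light in vacuum

def rtt_ms(one_way_km: float) -> float:
    """Idealized round-trip light time in milliseconds (propagation only)."""
    return 2 * one_way_km * 1e3 / C * 1e3

# Straight up to a ~550 km satellite and back:
print(f"{rtt_ms(550):.1f} ms")   # ~3.7 ms
# vs. a terrestrial datacenter ~1500 km away (and fiber is slower than vacuum):
print(f"{rtt_ms(1500):.1f} ms")  # ~10 ms
```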

Offline JayWee

  • Full Member
  • ****
  • Posts: 1135
  • Liked: 1141
  • Likes Given: 2727
Re: Elon Musk’s new Space AI ambitions
« Reply #27 on: 12/03/2025 08:15 pm »
It will also be hard to beat the latency since the inference will most likely happen right over your head and beamed straight back down. (not as good as living close to a datacenter)
This. Especially for robots like Optimus or drones.

That raises the limit but does not change the fundamentals. Those robots building stuff or vehicles delivering stuff are still ultimately providing goods and services to people. The total demand in the economy is still based on people.

"Technologically connected" people in this context doesn't necessarily just mean an internet connection, it means people who have decent access to/participation in the technological world economy (which isn't everyone).

There's no point in having automated factories building things there is no demand for. It's physically possible, but doesn't make sense.
Ultimately people benefit, yes. But the customers might not be nearby. There are people remotely driving mine excavators from 1,500 km away in Australia right now.
With global low-latency inference, you can easily answer the call of "we want to deploy thousands of robots in the middle of the Congo" by simply flying them in, instead of spending months or years planning all the necessary support infrastructure locally.



Offline Vultur

  • Senior Member
  • *****
  • Posts: 3227
  • Liked: 1427
  • Likes Given: 196
Re: Elon Musk’s new Space AI ambitions
« Reply #28 on: 12/03/2025 08:39 pm »
That raises the limit but does not change the fundamentals. Those robots building stuff or vehicles delivering stuff are still ultimately providing goods and services to people. The total demand in the economy is still based on people.

"Technologically connected" people in this context doesn't necessarily just mean an internet connection, it means people who have decent access to/participation in the technological world economy (which isn't everyone).

There's no point in having automated factories building things there is no demand for. It's physically possible, but doesn't make sense.
Ultimately people benefit, yes. But the customers might not be nearby. There are people remote driving mine excavators from 1500km away in Australia right now.
With global low-latency inference, you can easily answer the call to "We want to deploy thousands of robots in the middle of Congo" by simply flying them in, instead of having to plan all the necessary support infrastructure locally taking months/years.

Of course.

But I am talking about an upper limit set by total world demand, not local demand.

1TW (about 1/3 of current total world electricity use) is admittedly imaginable ... IF AI tech threads the narrow needle between "much greater efficiency means less energy need" and "too expensive to be used in everything once full costs are passed on to end users". And IF practicalities of building it all (and making the factories to build it all, etc) don't get in the way, and IF external factors (political backlash against AI affecting job markets, etc) don't get in the way.

I don't know where that upper limit is. But I think there has to be one.
« Last Edit: 12/03/2025 08:44 pm by Vultur »

Offline DigitalMan

  • Full Member
  • ****
  • Posts: 1802
  • Liked: 1260
  • Likes Given: 76
Re: Elon Musk’s new Space AI ambitions
« Reply #29 on: 12/03/2025 11:19 pm »
I think Musk is talking inference not training for the initial build out.

Training data-centers need to be colocated with extremely high-speed interconnects between nodes. This requires On-Orbit assembly.
Google is investigating whether they can avoid on-orbit assembly by instead using free-space optical data links between satellites flying in close (under 1km) formation:

https://research.google/blog/exploring-a-space-based-scalable-ai-infrastructure-system-design/

Quote
At the altitude of our planned constellation, the non-sphericity of Earth's gravitational field, and potentially atmospheric drag, are the dominant non-Keplerian effects impacting satellite orbital dynamics. In the figure below, we show trajectories (over one full orbit) for an illustrative 81-satellite constellation configuration in the orbital plane, at a mean cluster altitude of 650 km. The cluster radius is R=1 km, with the distance between next-nearest-neighbor satellites oscillating between ~100–200m, under the influence of Earth’s gravity.

The models show that, with satellites positioned just hundreds of meters apart, we will likely only require modest station-keeping maneuvers to maintain stable constellations within our desired sun-synchronous orbit.

I think Musk is a step ahead. Since most power consumption is inference he can launch that without the added complexity of tight interconnects. If built into larger Starlink satellites, you increase the addressable market for Starlink while having built-in data connects for the AI workloads.

It will also be hard to beat the latency since the inference will most likely happen right over your head and beamed straight back down. (not as good as living close to a datacenter)

I suspect the primary benefit he will see is reduced workloads running in third-party datacenters.

Offline meekGee

  • Senior Member
  • *****
  • Posts: 17563
  • N. California
  • Liked: 17881
  • Likes Given: 1502
Re: Elon Musk’s new Space AI ambitions
« Reply #30 on: 12/04/2025 05:50 am »
So fast forward to AI. You think the type of tasks it does is exhausted,

No, that's not what I'm saying at all, here.

I'm saying that even if tons of new applications are invented there's still a fairly hard upper limit which is ultimately more or less set by the number of "technologically connected" people. And that hard limit, at least in the next century or so, is probably below terawatt level use.

Especially with efficiency improvements. And I think fairly dramatic efficiency improvements are ultimately necessary to make widespread use of AI cost effective if/when something closer to the true cost is passed on to the user, which currently often isn't the case.

Quote
I'm pretty sure "TettaWatts" was not thrown around without regard.

Oh, I think there's thought behind it. But I think it's a hypothetical of what can be done assuming unbounded demand (as well as not hitting any limits in scaling up manufacture, etc.) It's perfectly valid in that context, I just don't think that's the most likely scenario.
But doesn't that limit mean you're imagining only a "person interacting with an AI" kind of application?

What if the AI is driving robotics?  Or vehicles? Or corporate AI workers?

That raises the limit but does not change the fundamentals. Those robots building stuff or vehicles delivering stuff are still ultimately providing goods and services to people. The total demand in the economy is still based on people.

"Technologically connected" people in this context doesn't necessarily just mean an internet connection, it means people who have decent access to/participation in the technological world economy (which isn't everyone).

There's no point in having automated factories building things there is no demand for. It's physically possible, but doesn't make sense.

Also I think the cost issues still hit. I don't think AI doing a lot of those things will end up being cost effective at current total costs, once (or if) those costs are passed on to the end user.

So the terawatt+ scenario isn't impossible, no. But I think it requires a fairly narrow needle to be threaded. If AI gets much more efficient, you don't get terawatt power use; if AI doesn't get more efficient, its use probably becomes more limited due to cost. There's a very narrow range in there (assuming those two scenarios don't overlap entirely, leaving no range) where AI is still energy hungry enough to demand terawatt+ power but cheap enough to be used widely once real costs catch up.

(And that's assuming other factors not directly related to the technology itself don't get in the way. Which strikes me as likely.)
But if I'm understanding it correctly, you're saying that the breadth of AI activity is proportional to the population (e.g. times "participation fraction" times "utilization factor", etc.), so it can't grow exponentially.

Which is true.

But the utilization factor - AI footprint per participating person - can still grow by 1,000x or 1,000,000x from what it is today.

There will be a ceiling, we just don't have any idea where that ceiling is.

I mean - look at semiconductor technology.  It's capped, but we're not close to the limit, even 70 years in.
ABCD - Always Be Counting Down

Online volker2020

  • Full Member
  • ***
  • Posts: 353
  • Frankfurt, Germany
  • Liked: 374
  • Likes Given: 950
Re: Elon Musk’s new Space AI ambitions
« Reply #31 on: 12/04/2025 06:29 am »
Since we are all numbers people here, let's lay out some numbers and think this through:

One traditional GWh data center for AI on Earth has initial setup costs of around $80,000,000,000 and a shelf life of around 5 years before the hardware has to be replaced.

I am sure that a clever man with a lot of money can reduce that cost by some margin by using custom, self-produced AI chips and mass production, but assuming that will be more than 50% does not sound very realistic, especially considering the need to space-harden the whole setup, put it into a rocket, make it safe to 2 g of acceleration, and so on. Then there will be the extra cost of moving that hardware.

In the end, I guess we can agree that even holding the cost to $100 billion per data center is doubtful. So each of these data centers in space needs to produce around $20 billion of revenue per year, and there must be a productivity gain greater than or at least equal to that cost.
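A quick amortization sketch with those figures (capex and hardware life as stated in the post; financing, power, and operations costs are ignored, which only raises the bar):

```python
def required_annual_revenue(capex_usd: float, hardware_life_years: float) -> float:
    """Yearly revenue needed just to amortize the build-out cost."""
    return capex_usd / hardware_life_years

# A $100B datacenter whose hardware is replaced every 5 years:
print(required_annual_revenue(100e9, 5) / 1e9)  # 20.0 -> $20B per year
```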

But the productivity gain is severely limited by what some call hallucinations, and what people like me call unavoidable consequences of the underlying algorithm of all current generative AIs. As long as this flaw of the underlying statistical model is not solved (and that would need one truly genius idea, which cannot be predicted), the usability of AI has its limits. Three years after introduction, there is still no company making serious money with this; OpenAI currently runs a loss of around $15 billion per quarter. So who will pay for this? Schoolchildren cheating on their homework are clearly not the answer. In my line of work, I can realize a productivity increase of around 5% using AI, which is nothing to sneeze at, but far off the promises given by the AI executives.

So no, I don't believe that much will happen in this round. That might change in the future, but that future has not presented itself so far. 

Offline launchwatcher

  • Full Member
  • ****
  • Posts: 833
  • Liked: 817
  • Likes Given: 1238
Re: Elon Musk’s new Space AI ambitions
« Reply #32 on: 12/04/2025 02:47 pm »
But wait, there's more:

Quote
Sam Altman Has Explored Deal to Build Competitor to Elon Musk’s SpaceX

The OpenAI CEO has publicly talked about the possibility of building ‘a rocket company’ and the potential for developing data centers in space.

OpenAI Chief Executive Sam Altman has explored putting together funds to either acquire or partner with a rocket company, a move that would position him to compete against Elon Musk’s SpaceX.

Altman reached out to at least one rocket maker, Stoke Space, in the summer, and the discussions picked up in the fall, according to people familiar with the talks. Among the proposals was for OpenAI to make a series of equity investments in the company and end up with a controlling stake. Such an investment would total billions of dollars over time.
If you subscribe to the WSJ, see: https://www.wsj.com/tech/ai/sam-altman-has-explored-deal-to-build-competitor-to-elon-musks-spacex-01574ff7

Offline RedLineTrain

  • Senior Member
  • *****
  • Posts: 3301
  • Liked: 2907
  • Likes Given: 12056
Re: Elon Musk’s new Space AI ambitions
« Reply #33 on: 12/04/2025 02:53 pm »
Since we are all numbers people here, let's lay out some numbers and think this through:

One traditional GWh data center for AI on Earth has initial setup costs of around $80,000,000,000 and a shelf life of around 5 years before the hardware has to be replaced.

I am sure that a clever man with a lot of money can reduce that cost by some margin by using custom, self-produced AI chips and mass production, but assuming that will be more than 50% does not sound very realistic, especially considering the need to space-harden the whole setup, put it into a rocket, make it safe to 2 g of acceleration, and so on. Then there will be the extra cost of moving that hardware.

It's GW (gigawatt), not GWh (gigawatt-hours).  Regardless, the cost reduction from using self-produced AI chips can be quite dramatic, and much more than 50%, because NVidia has 73% gross margins.  Of course, this is true whether we are talking terrestrial or orbital.
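The margin arithmetic can be made explicit. A sketch, using the 73% gross-margin figure cited above (the per-GPU price is an illustrative assumption, not a quoted number):

```python
def implied_cost(price_usd: float, gross_margin: float) -> float:
    """Manufacturing cost implied by a supplier's gross margin:
    cost = price * (1 - margin)."""
    return price_usd * (1 - gross_margin)

price = 30_000.0  # illustrative per-GPU price, not a quoted figure
cost = implied_cost(price, 0.73)
print(round(cost))                  # 8100 -> roughly 27 cents on the dollar
print(round(1 - cost / price, 2))   # 0.73 -> up to ~73% saving by building at cost
```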

Offline RedLineTrain

  • Senior Member
  • *****
  • Posts: 3301
  • Liked: 2907
  • Likes Given: 12056
Re: Elon Musk’s new Space AI ambitions
« Reply #34 on: 12/04/2025 02:59 pm »
But wait, there's more:

Quote
Sam Altman Has Explored Deal to Build Competitor to Elon Musk’s SpaceX

Musk and Altman are battling each other and Google/Microsoft/Meta/NVidia/Broadcom by threatening to verticalize the necessary supply chains.  If threats of vertical supply chains are the game, then Musk is the master by being able to extend it from the fab all the way to space, with integrated power generation and power management.

As an observer, I take all of these threats with a grain of salt and make allowances for posturing.  But I keep an open mind.  The space industry might benefit a bit through additional investment because of all of the posturing, even if ultimately space is not part of the game.  Unsure.
« Last Edit: 12/04/2025 03:24 pm by RedLineTrain »

Offline wes_wilson

  • Armchair Rocketeer
  • Full Member
  • ****
  • Posts: 513
  • Florida
    • Foundations IT, Inc.
  • Liked: 582
  • Likes Given: 399
Re: Elon Musk’s new Space AI ambitions
« Reply #35 on: 12/04/2025 04:56 pm »
Will be a wild future if AI becomes the "thing" in space that closes the business case for becoming a spacefaring civilization. 
@SpaceX "When can I buy my ticket to Mars?"

Offline Vultur

  • Senior Member
  • *****
  • Posts: 3227
  • Liked: 1427
  • Likes Given: 196
Re: Elon Musk’s new Space AI ambitions
« Reply #36 on: 12/04/2025 05:05 pm »
So fast forward to AI. You think the type of tasks it does is exhausted,

No, that's not what I'm saying at all, here.

I'm saying that even if tons of new applications are invented there's still a fairly hard upper limit which is ultimately more or less set by the number of "technologically connected" people. And that hard limit, at least in the next century or so, is probably below terawatt level use.

Especially with efficiency improvements. And I think fairly dramatic efficiency improvements are ultimately necessary to make widespread use of AI cost effective if/when something closer to the true cost is passed on to the user, which currently often isn't the case.

Quote
I'm pretty sure "TettaWatts" was not thrown around without regard.

Oh, I think there's thought behind it. But I think it's a hypothetical of what can be done assuming unbounded demand (as well as not hitting any limits in scaling up manufacture, etc.) It's perfectly valid in that context, I just don't think that's the most likely scenario.
But doesn't that limit mean you're imagining only a "person interacting with an AI" kind of application?

What if the AI is driving robotics?  Or vehicles? Or corporate AI workers?

That raises the limit but does not change the fundamentals. Those robots building stuff or vehicles delivering stuff are still ultimately providing goods and services to people. The total demand in the economy is still based on people.

"Technologically connected" people in this context doesn't necessarily just mean an internet connection, it means people who have decent access to/participation in the technological world economy (which isn't everyone).

There's no point in having automated factories building things there is no demand for. It's physically possible, but doesn't make sense.

Also I think the cost issues still hit. I don't think AI doing a lot of those things will end up being cost effective at current total costs, once (or if) those costs are passed on to the end user.

So the terawatt+ scenario isn't impossible, no. But I think it requires a fairly narrow needle to be threaded. If AI gets much more efficient, you don't get terawatt power use; if AI doesn't get more efficient, its use probably becomes more limited due to cost. There's a very narrow range in there (assuming those two scenarios don't overlap entirely, leaving no range) where AI is still energy hungry enough to demand terawatt+ power but cheap enough to be used widely once real costs catch up.

(And that's assuming other factors not directly related to the technology itself don't get in the way. Which strikes me as likely.)
But if I'm understanding it correctly, you're saying that the breadth of AI activity is proportional to the population (e.g. times "participation fraction" times "utilization factor" etc..) so it can't grow exponentially.

Which is true.


That's part of what I'm saying, but not all of it.

The other half is that (once things settle out and it becomes clear what "AI" works best for what applications*) AI will presumably only be used when it's the more cost effective way of doing things.

So cost effectiveness sets a limit on that utilization factor. If the energy (and hardware) cost of the AI is greater than the cost of doing the same thing without AI, a rational business won't use AI for that task.

I think that if AI energy costs don't decrease a lot, it won't prove to be cost-effective for a lot of things once full costs are passed on.


*Which may well mean a move away from LLMs to more specialized, efficient, and reliable systems, using far less energy

Quote
But the utilization factor - AI footprint per participating person - that factor can still grow by 1000x or a 1,000,000x from what it is today.

This is what I disagree with, at least if you define footprint in terms of energy use. AI using that much energy won't be cost effective. Not even with space solar power.

Datacenters are already using several percent of total electricity use. Though some of that is non-AI stuff ... But AI is definitely well into the gigawatts. 1,000,000x would be well into petawatts, probably more than 1MW/person even at a hypothetical peak world population of maybe 10-11B. I don't see how it can be cost effective for 99.7% of civilization's energy use to be AI.
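That arithmetic checks out at the order-of-magnitude level. A sketch (today's AI draw and the peak population are rough assumptions, not measured figures):

```python
# Order-of-magnitude check on the "1,000,000x" scenario.
ai_today_w = 10e9   # assume ~10 GW of AI compute today (rough)
population = 10e9   # assumed peak world population

scaled_w = ai_today_w * 1_000_000
print(scaled_w / 1e15)        # 10.0 -> 10 petawatts
print(scaled_w / population)  # 1000000.0 -> 1 MW per person
```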

Quote
I mean - look at semiconductor technology.  It's capped, but we're not close to scratching the limit, even 70 years in.

We may not be close to the physical limit but we're into the diminishing returns phase. Moore's Law has effectively ended.

IMO we are (as a civilization) now investing far too much of our R&D effort into computer/IT technologies rather than technologies which are earlier on their S curve, where investment would be far more beneficial.
« Last Edit: 12/04/2025 05:14 pm by Vultur »

Offline meekGee

  • Senior Member
  • *****
  • Posts: 17563
  • N. California
  • Liked: 17881
  • Likes Given: 1502
Re: Elon Musk’s new Space AI ambitions
« Reply #37 on: 12/05/2025 05:05 am »
That's part of what I'm saying, but not all of it.

The other half is that (once things settle out and it becomes clear what "AI" works best for what applications*) AI will presumably only be used when it's the more cost effective way of doing things.

So cost effectiveness sets a limit on that utilization factor. If the energy (and hardware) cost of the AI is greater than the cost of doing the same thing without AI, a rational business won't use AI for that task.

I think that if AI energy costs don't decrease a lot, it won't prove to be cost-effective for a lot of things once full costs are passed on.


*Which may well mean a move away from LLMs to more specialized, efficient, and reliable systems, using far less energy

Quote
But the utilization factor - AI footprint per participating person - that factor can still grow by 1000x or a 1,000,000x from what it is today.

This is what I disagree with, at least if you define footprint in terms of energy use. AI using that much energy won't be cost effective. Not even with space solar power.

Datacenters are already using several percent of total electricity use. Though some of that is non-AI stuff ... But AI is definitely well into the gigawatts. 1,000,000x would be well into petawatts, probably more than 1MW/person even at a hypothetical peak world population of maybe 10-11B. I don't see how it can be cost effective for 99.7% of civilization's energy use to be AI.

Quote
I mean - look at semiconductor technology.  It's capped, but we're not close to scratching the limit, even 70 years in.

We may not be close to the physical limit but we're into the diminishing returns phase. Moore's Law has effectively ended.

IMO we are (as a civilization) now investing far too much of our R&D effort into computer/IT technologies rather than technologies which are earlier on their S curve, where investment would be far more beneficial.
So let's split it into "doing the same things but cheaper" and "doing new things" (types A and B for this discussion).

Obvious analogies: thermodynamic engines let us do sailing ships and horse-drawn carriages better, but also let us do airplanes and rockets.

Semiconductor tech allowed us to do radios better (replacing vacuum tubes) but then went on to create a couple other things.

Right now AI does some of both A and B.

Your argument applies only to type A, though even there AI is often much more power intensive yet still "does it better".

Type B meanwhile is entirely outside the scope of your argument.

As per the analogies, cars are better than horses even if they consume more energy per mile, because they're faster and more dependable.  So there are more cars than there ever were horse-drawn carriages.  Meanwhile airplanes and rockets enabled entire sectors of industry that were not imaginable in the late 19th and early 20th century.
« Last Edit: 12/05/2025 09:12 pm by meekGee »
ABCD - Always Be Counting Down

Offline Vultur

  • Senior Member
  • *****
  • Posts: 3227
  • Liked: 1427
  • Likes Given: 196
Re: Elon Musk’s new Space AI ambitions
« Reply #38 on: 12/05/2025 05:57 am »
Eh. I don't really agree, but we'll have to see what happens ... And it might be a couple decades before the answer is clear.

What I'm more concerned about in the short term is an investment bubble burst bringing down space stuff along with everything else. This could happen *regardless* of the soundness of the underlying technology - the dot com bubble of the late 90s wrecked private space projects then, though both the Internet in general and satellite internet specifically ultimately did work out.

Offline meekGee

  • Senior Member
  • *****
  • Posts: 17563
  • N. California
  • Liked: 17881
  • Likes Given: 1502
Re: Elon Musk’s new Space AI ambitions
« Reply #39 on: 12/05/2025 07:18 am »
Eh. I don't really agree, but we'll have to see what happens ... And it might be a couple decades before the answer is clear.

What I'm more concerned about in the short term is an investment bubble burst bringing down space stuff along with everything else. This could happen *regardless* of the soundness of the underlying technology - the dot com bubble of the late 90s wrecked private space projects then, though both the Internet in general and satellite internet specifically ultimately did work out.
I'm pretty sure that it'll go down a lot faster than it did with thermodynamic engines.
ABCD - Always Be Counting Down
