Again, in earlier times you could've easily argued that "energy use can't grow further" because there isn't enough wood that grows in all the world's forests. We all know why that prediction was wrong (coal). Similarly, saying "power consumption on earth can't increase" ignores the fact that we can use power in space to serve needs on Earth (which, reminder, is the subject of this thread). How many times have people predicted the "end of growth"? How many times have they been wrong, so far?
Quote from: Twark_Main on 12/18/2025 05:32 pm
Again, in earlier times you could've easily argued that "energy use can't grow further" because there isn't enough wood that grows in all the world's forests. We all know why that prediction was wrong (coal). Similarly, saying "power consumption on earth can't increase" ignores the fact that we can use power in space to serve needs on Earth (which, reminder, is the subject of this thread). How many times have people predicted the "end of growth"? How many times have they been wrong, so far?

Supply vs demand, though. I agree it has historically been a very bad bet that technology won't increase supply to match demand, e.g. the mid-20th-century fears of vast overpopulation famines. The transition from wood to coal to oil to the current energy mix increased supply, but demand was already increasing rapidly, due both to rapid population growth and to more and more of the world population being brought into the world economy.

Given current falling population growth rates, the pace of industrialization in the highest-population parts of the world such as China and India, and the fact that new technologies are often more energy efficient at serving the same needs (current LLMs being a huge exception to that general trend, admittedly), it is entirely plausible that world growth in energy *demand* will end by the middle of this century or so. I think datacenters can raise the peak but not extend the timing... Either vastly more efficient ways to run "AI" in some sense (LLMs or not) will be found, or costs will increase enough that many current uses become non-viable. And I think one or the other will happen well before 2050.

If space datacenters do end up being cheaper... well, they're still not free. The question is whether they can be cheap enough to be funded by advertising money and actual subscriptions.
AI still needs to be supported by the rest of the economy. That doesn't mean it's forever, but it does mean future demand growth would not be driven by a continuation of current trends. It would have to come from either a reversal of current trends (e.g. if total human population starts increasing again, due to cultural or socioeconomic changes or to off-Earth settlements growing on their own) or from something entirely new (e.g. if super-projects like an interstellar-probe launching laser are built).
World average per capita energy consumption is about 2,500 W of primary energy. US per capita primary energy consumption is four times that, roughly 10 kW. If you spend an hour a day flying a Learjet, that's about 25 times that again, roughly 250 kW. I think there's still quite a lot of room for global energy use to increase without more population growth. Waste heat will become a constraint even if it's all carbon neutral.
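As a back-of-envelope sanity check on those figures, the 250 kW Learjet number works out if you assume a mid-size bizjet fuel burn of roughly 170 US gal/hr and a jet-fuel energy density of about 35 MJ/L (both round assumed values, not from the post):

```python
# Back-of-envelope check of the per-capita power figures above.
# Assumed illustrative values (not measured data):
GAL_TO_L = 3.785              # US gallons to liters
JET_FUEL_MJ_PER_L = 35.0      # assumed energy density of jet fuel
FUEL_BURN_GAL_PER_HR = 170.0  # assumed mid-size bizjet burn rate

world_avg_w = 2500.0          # world per-capita primary power (from the post)
us_w = 4 * world_avg_w        # US per-capita, 4x the world average

# One hour of flight per day, averaged over 24 hours, in watts:
flight_energy_j = FUEL_BURN_GAL_PER_HR * GAL_TO_L * JET_FUEL_MJ_PER_L * 1e6
learjet_avg_w = flight_energy_j / 86_400

print(f"US per capita:       {us_w / 1000:.0f} kW")
print(f"1 hr/day of Learjet: {learjet_avg_w / 1000:.0f} kW")
```

Under those assumptions the daily average comes out around 260 kW, i.e. the same ballpark as the "about 25 times" figure.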
Quote from: Robotbeat on 12/18/2025 09:26 pm
World average per capita energy consumption is about 2,500 W of primary energy. US per capita primary energy consumption is four times that, roughly 10 kW. If you spend an hour a day flying a Learjet, that's about 25 times that again, roughly 250 kW. I think there's still quite a lot of room for global energy use to increase without more population growth. Waste heat will become a constraint even if it's all carbon neutral.

How much room? 10x? 100x? I'm OK with all of those. The future is wild and unpredictable. It's the 100,000,000x I took issue with.

And yes, everything you said about energy is true. I figured PV comes to a 1:1 comparison with electrical generation, but all those differences don't make up even one order of magnitude, so I let it be.
The idea, clearly, is that AI training tasks let you beam the value back to Earth (i.e. the finished trained model) without beaming the BTUs back to Earth (and vaporizing the planet, in the limiting growth case).
Quote from: Twark_Main on 12/01/2025 06:19 pm
The idea, clearly, is that AI training tasks let you beam the value back to Earth (i.e. the finished trained model) without beaming the BTUs back to Earth (and vaporizing the planet, in the limiting growth case).

The problem is that training is only 10–20% of the compute load. Inference latencies can be non-trivial, but I'd guess that a half-second RTT is about all you can handle if you want to stay competitive. So you might be able to use some kind of sub-synchronous orbit, well away from both LEO and those pesky protons.
That sort of depends. I think some may end up being served directly from "regular"-orbit Starlinks not on the terminator. That could reduce latency to, like, sub-10 milliseconds.

Note I was looking at SOTA models, and they tend to have latencies of 0.3 to 4 s to first token. It's not necessarily a problem to put them in, say, MEO or GSO, or even out to lunar orbit.
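Those latency figures line up with the raw light lag. A minimal sketch of the best-case round trip to an orbiting datacenter directly overhead (ignoring routing, queuing, and inference time; the altitudes are typical round values, not exact orbits):

```python
# Minimum speed-of-light round-trip time (user -> orbiting datacenter -> user)
# for a satellite directly overhead. Real latency adds ground routing,
# queuing, and model inference time on top of this floor.

C_KM_S = 299_792.458  # speed of light, km/s

altitudes_km = {
    "LEO (Starlink, ~550 km)": 550,
    "MEO (~8,000 km)": 8_000,
    "GSO (~35,786 km)": 35_786,
    "Lunar distance (~384,400 km)": 384_400,
}

for name, d in altitudes_km.items():
    rtt_ms = 2 * d / C_KM_S * 1000
    print(f"{name:30s} {rtt_ms:10.1f} ms")
```

LEO comes out under 4 ms (hence "sub-10 ms"), GSO around 240 ms, and lunar distance around 2.6 s, i.e. within the 0.3–4 s time-to-first-token range quoted above.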
Quote from: Robotbeat on 12/18/2025 10:50 pm
That sort of depends. I think some may end up being served directly from "regular"-orbit Starlinks not on the terminator. That could reduce latency to, like, sub-10 milliseconds. Note I was looking at SOTA models, and they tend to have latencies of 0.3 to 4 s to first token. It's not necessarily a problem to put them in, say, MEO or GSO, or even out to lunar orbit.

How are you serving a customer from directly overhead? Unless every satellite has all the data needed, this doesn't work.
Related: https://x.com/StockSavvyShay/status/2001391832241181103

Quote
Elon Musk told xAI staff that winning the AI race comes down to surviving the next 2–3 years & scaling compute, power & capital faster than anyone else.

Key points from the meeting:
• xAI could achieve AGI as early as 2026
• xAI has access to ~$30B per year in funding
• Grok 5 carries ~10% probability of achieving AGI
• Optimus humanoid robots could eventually operate data centers
• Plans to scale from ~200k GPUs to ~1M GPUs across its Colossus data centers
• Long-term concepts include data centers in space & Mars-related infrastructure
Eh. I don't really agree, but we'll have to see what happens... and it might be a couple of decades before the answer is clear. What I'm more concerned about in the short term is an investment bubble bursting and bringing down space stuff along with everything else. This could happen *regardless* of the soundness of the underlying technology: the dot-com bubble of the late '90s wrecked private space projects then, though both the Internet in general and satellite internet specifically ultimately did work out.
Depends on the model. Training can be 40% or more of total energy use, particularly if models become more sophisticated. And quite a bit of AI usage could tolerate seconds of latency, such as deep research or video/image generation. Short, low-latency prompts don't use much energy; really long prompts or extensive code generation do.

I think short prompts will continue to be served very near the point of use. But long research, "thinking" models, ensembles of multiple models, very long inputs and outputs, not to mention video and image generation, can already take minutes to produce results and use tremendous energy to do so, and those can be offloaded to beyond LEO.
...I mean LLMs caught me off guard and I'm super impressed, but it's also clear there's no I in that AI. So... GAI around the corner? I dunno.
...So for me, I don't blindly accept what any CEO says, because they have vested interests in specific outcomes, and they will use the power of their position to influence their potential customers and their competitors.
Quote from: meekGee on 12/19/2025 04:30 am
...I mean LLMs caught me off guard and I'm super impressed, but it's also clear there's no I in that AI. So... GAI around the corner? I dunno.

There is an NSF thread called "What Billionaire Tech CEOs Get Wrong About The Future, with Adam Becker", and in a post I made I said:

Quote from: Coastal Ron on 12/10/2025 02:27 am
...So for me, I don't blindly accept what any CEO says, because they have vested interests in specific outcomes, and they will use the power of their position to influence their potential customers and their competitors.

Elon Musk has a vested interest in everyone being excited about xAI, Optimus, Tesla FSD, and now SpaceX data centers in space. With SpaceX in the past, there was no direct financial connection between SpaceX's success with reusability and me, but with AI there is, because Elon Musk wants to take SpaceX public, and he is relying on the public not only to help fund the company but to increase his financial wealth.

So when someone has vested interests in specific outcomes, I don't tend to believe them 100%. And since nobody has been able to come up with a definition of AGI (Artificial General Intelligence) that can be measured, Musk, Altman, and everyone else can promise whatever they want.
Quote from: thespacecow on 12/18/2025 09:24 am
Related: https://x.com/StockSavvyShay/status/2001391832241181103

Quote
Elon Musk told xAI staff that winning the AI race comes down to surviving the next 2–3 years & scaling compute, power & capital faster than anyone else.

Key points from the meeting:
• xAI could achieve AGI as early as 2026
• xAI has access to ~$30B per year in funding
• Grok 5 carries ~10% probability of achieving AGI
• Optimus humanoid robots could eventually operate data centers
• Plans to scale from ~200k GPUs to ~1M GPUs across its Colossus data centers
• Long-term concepts include data centers in space & Mars-related infrastructure

This is when I find myself a little bit on Vultur's side of the argument... Musk is probably the greatest industrialist that ever lived, but when it comes to having a handle on future computer tech, sometimes I just don't know... Like some 10 years ago, when he said his engineers were using motion capture to do CAD like Tony Stark does, or Tesla's ever-imminent self-driving capabilities.

I mean LLMs caught me off guard and I'm super impressed, but it's also clear there's no I in that AI. So... GAI around the corner? I dunno.