Author Topic: Moving The Cloud to orbit  (Read 95966 times)

Offline Asteroza

  • Senior Member
  • *****
  • Posts: 3127
  • Liked: 1211
  • Likes Given: 35
Re: Moving "the cloud" to orbit
« Reply #40 on: 07/08/2025 09:20 am »
Does this imply that AI datacenters in orbit, needing to buffer power, would ultimately be overpowered and carry energy storage? In principle, that sounds like the batteries of an orbital power station.

So orbital AI datacenters would have an ideal side gig as an SSO SPS, providing dawn/dusk supplemental power to the grid thanks to the available energy storage?

Offline edzieba

  • Virtual Realist
  • Senior Member
  • *****
  • Posts: 7411
  • United Kingdom
  • Liked: 11382
  • Likes Given: 52
Re: Moving "the cloud" to orbit
« Reply #41 on: 07/08/2025 04:33 pm »
Does this imply that AI datacenters in orbit, needing to buffer power, would ultimately be overpowered and carry energy storage? In principle, that sounds like the batteries of an orbital power station.

So orbital AI datacenters would have an ideal side gig as an SSO SPS, providing dawn/dusk supplemental power to the grid thanks to the available energy storage?
The problem is that the grid needs power delivered on demand (and stopped on demand, too), and 'excess' power from a bursty AI training datacentre does not deliver that: it delivers massive spikes of power at effectively 'random' times (from the grid's perspective).
Or put another way: the power draw variation of AI training datacentres is already a problem for power grids, so turning it into a power supply problem is just the exact same problem with the sign flipped.

Online Robotbeat

  • Senior Member
  • *****
  • Posts: 41098
  • Minnesota
  • Liked: 27120
  • Likes Given: 12779
Re: Moving "the cloud" to orbit
« Reply #42 on: 07/08/2025 09:21 pm »
There are turning out to be a few types of data centers. For AI data centers, there are two types: training and inference. Data centers devoted to training need buffered power because there are dramatic usage spikes. Buffering requires batteries, which are very heavy. …
This isn’t true. In orbit, you either have short, ~40-minute shadow periods (far shorter than the 12-18 hours of low solar on Earth), which means very small batteries (batteries are INCREDIBLY good at high power but heavy for long-duration storage… although better than people think), or you are in a sun-synchronous or high orbit and don’t need batteries at all (other than for very short periods of eclipse, perhaps).

Training workloads last for days, weeks, or months at a time. So in fact they’re MORE consistent than inference, which varies with demand from humans.

That's not the way it works, unfortunately.

https://semianalysis.com/2025/06/25/ai-training-load-fluctuations-at-gigawatt-scale-risk-of-power-grid-blackout/

Quote
Not only is the scale massive, but AI training workloads have a very unique load profile, unexpectedly rising and falling from full load to nearly idle in fractions of a second. . .  The issue caught leading AI labs by surprise. Meta’s LLaMa 3 paper mentions challenges with power fluctuations, and that is “only” a 24,000 H100 Cluster (30MW of IT capacity).
Yes, it DOES work that way. Inverters (both solar and batteries) can respond in fractions of a second. This is NOT true for most of the terrestrial grid, which is why it can be a problem for terrestrial datacenters.
Chris  Whoever loves correction loves knowledge, but he who hates reproof is stupid.

To the maximum extent practicable, the Federal Government shall plan missions to accommodate the space transportation services capabilities of United States commercial providers. US law http://goo.gl/YZYNt0

Online Robotbeat

Re: Moving "the cloud" to orbit
« Reply #43 on: 07/08/2025 09:23 pm »
Does this imply that AI datacenters in orbit, needing to buffer power, would ultimately be overpowered and carry energy storage? In principle, that sounds like the batteries of an orbital power station.

So orbital AI datacenters would have an ideal side gig as an SSO SPS, providing dawn/dusk supplemental power to the grid thanks to the available energy storage?
The problem is that the grid needs power delivered on demand (and stopped on demand, too), and 'excess' power from a bursty AI training datacentre does not deliver that: it delivers massive spikes of power at effectively 'random' times (from the grid's perspective).
Or put another way: the power draw variation of AI training datacentres is already a problem for power grids, so turning it into a power supply problem is just the exact same problem with the sign flipped.
The reason why it’s a problem for TERRESTRIAL grids is because the terrestrial grid is dominated by thermal power plants which respond slowly over minutes or hours. Satellites use solar and batteries with DC-DC converters that can respond in fractions of a second.

This is pretty fundamental.

Offline Lee Jay

  • Elite Veteran
  • Senior Member
  • *****
  • Posts: 9137
  • Liked: 4282
  • Likes Given: 408
Re: Moving "the cloud" to orbit
« Reply #44 on: 07/08/2025 09:36 pm »
There are turning out to be a few types of data centers. For AI data centers, there are two types: training and inference. Data centers devoted to training need buffered power because there are dramatic usage spikes. Buffering requires batteries, which are very heavy. …
This isn’t true. In orbit, you either have short, ~40-minute shadow periods (far shorter than the 12-18 hours of low solar on Earth), which means very small batteries (batteries are INCREDIBLY good at high power but heavy for long-duration storage… although better than people think), or you are in a sun-synchronous or high orbit and don’t need batteries at all (other than for very short periods of eclipse, perhaps).

Training workloads last for days, weeks, or months at a time. So in fact they’re MORE consistent than inference, which varies with demand from humans.

That's not the way it works, unfortunately.

https://semianalysis.com/2025/06/25/ai-training-load-fluctuations-at-gigawatt-scale-risk-of-power-grid-blackout/

Quote
Not only is the scale massive, but AI training workloads have a very unique load profile, unexpectedly rising and falling from full load to nearly idle in fractions of a second. . .  The issue caught leading AI labs by surprise. Meta’s LLaMa 3 paper mentions challenges with power fluctuations, and that is “only” a 24,000 H100 Cluster (30MW of IT capacity).
Yes, it DOES work that way. Inverters (both solar and batteries) can respond in fractions of a second. This is NOT true for most of the terrestrial grid, which is why it can be a problem for terrestrial datacenters.

If you have a source, yes.  So batteries yes, solar no.  But you're missing the point that these changes can be far, far faster than even inverters can respond.  As I posted above, we *measured* 7kHz variations at the MW scale.  Inverters typically have an open-loop bandwidth that's less than 60Hz and switching frequencies in the 3kHz range (for large ones).  These high-frequency changes cannot be controlled by inverter controls, they have to be managed at the electrical level.

Offline Lee Jay

Re: Moving "the cloud" to orbit
« Reply #45 on: 07/08/2025 09:37 pm »
Does this imply that AI datacenters in orbit, needing to buffer power, would ultimately be overpowered and carry energy storage? In principle, that sounds like the batteries of an orbital power station.

So orbital AI datacenters would have an ideal side gig as an SSO SPS, providing dawn/dusk supplemental power to the grid thanks to the available energy storage?
The problem is that the grid needs power delivered on demand (and stopped on demand, too), and 'excess' power from a bursty AI training datacentre does not deliver that: it delivers massive spikes of power at effectively 'random' times (from the grid's perspective).
Or put another way: the power draw variation of AI training datacentres is already a problem for power grids, so turning it into a power supply problem is just the exact same problem with the sign flipped.
The reason why it’s a problem for TERRESTRIAL grids is because the terrestrial grid is dominated by thermal power plants which respond slowly over minutes or hours. Satellites use solar and batteries with DC-DC converters that can respond in fractions of a second.

This is pretty fundamental.

Terrestrial grids also have enormous inertia, giving them the ability to naturally respond in fractions of an electrical cycle, even if that response only has the energy to sustain the response for a few seconds or tens of seconds.

Online Robotbeat

Re: Moving "the cloud" to orbit
« Reply #46 on: 07/08/2025 10:01 pm »
There are turning out to be a few types of data centers. For AI data centers, there are two types: training and inference. Data centers devoted to training need buffered power because there are dramatic usage spikes. Buffering requires batteries, which are very heavy. …
This isn’t true. In orbit, you either have short, ~40-minute shadow periods (far shorter than the 12-18 hours of low solar on Earth), which means very small batteries (batteries are INCREDIBLY good at high power but heavy for long-duration storage… although better than people think), or you are in a sun-synchronous or high orbit and don’t need batteries at all (other than for very short periods of eclipse, perhaps).

Training workloads last for days, weeks, or months at a time. So in fact they’re MORE consistent than inference, which varies with demand from humans.

That's not the way it works, unfortunately.

https://semianalysis.com/2025/06/25/ai-training-load-fluctuations-at-gigawatt-scale-risk-of-power-grid-blackout/

Quote
Not only is the scale massive, but AI training workloads have a very unique load profile, unexpectedly rising and falling from full load to nearly idle in fractions of a second. . .  The issue caught leading AI labs by surprise. Meta’s LLaMa 3 paper mentions challenges with power fluctuations, and that is “only” a 24,000 H100 Cluster (30MW of IT capacity).
Yes, it DOES work that way. Inverters (both solar and batteries) can respond in fractions of a second. This is NOT true for most of the terrestrial grid, which is why it can be a problem for terrestrial datacenters.

If you have a source, yes.  So batteries yes, solar no.  But you're missing the point that these changes can be far, far faster than even inverters can respond.  As I posted above, we *measured* 7kHz variations at the MW scale.  Inverters typically have an open-loop bandwidth that's less than 60Hz and switching frequencies in the 3kHz range (for large ones).  These high-frequency changes cannot be controlled by inverter controls, they have to be managed at the electrical level.
There is no need to respond to changes in DC loads any faster than fractions of a second, because you can literally just use capacitors, which are in every power supply. Capacitors respond effectively infinitely fast for our purposes.

This is, again, fundamental.
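As a rough illustration of the hold-up arithmetic behind this point (the bus voltage, droop allowance, and step size below are assumptions chosen for illustration, not figures from any actual design):

```python
# Bus capacitance needed to ride through a fast load swing until a DC-DC
# converter catches up. All numbers here are illustrative assumptions.

delta_p = 1.0e6     # 1 MW load step
hold_time = 1e-3    # hold for 1 ms, ample for a converter with kHz-class control
v_nom = 800.0       # nominal DC bus voltage (assumed)
v_min = 760.0       # allow ~5% droop during the transient

# Energy balance: (C/2) * (v_nom^2 - v_min^2) = delta_p * hold_time
cap_farads = 2 * delta_p * hold_time / (v_nom**2 - v_min**2)
print(f"{cap_farads * 1e3:.0f} mF")  # ~32 mF of bus capacitance
```

Tens of millifarads at the megawatt scale is an ordinary electrolytic or film capacitor bank, which is the sense in which capacitors handle the fast end of the spectrum "for free."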

Online Robotbeat

Re: Moving "the cloud" to orbit
« Reply #47 on: 07/08/2025 10:05 pm »
Instead of making up limitations that don’t apply to satellites (and show a misunderstanding of basic electronics), I think people should focus on the real reasons why it doesn’t make sense to put datacenters in space at the moment. For example, a DGX-2 AI server already costs about $40/Watt, while a “baseload”-equivalent terrestrial energy source costs about $5-10/Watt, so all the trouble of going to space for cheaper power is kind of not worth it: you’re only saving a small fraction of the TCOE.

Potentially a different story if the upfront server cost per demand watt is like a few dollars per watt. But we aren’t there yet for AI servers. Cyrptocurrency (heh, spelling it correctly gets the forum to autocorrect it to “scamcurrency”… fair play tbh) mining, maybe (this is comparable to ASIC cost and power consumption), but that has a bunch of problems associated with it.

And in general, all of these things depend on Starship or something like it succeeding, with costs getting down to $100 per kg or so. At the moment, Starship isn’t doing too well. So this remains a few years in the future at best, except for niche uses.
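The savings fraction can be sketched with the round numbers above (these are this post's rough estimates, not vendor pricing):

```python
# What fraction of combined capex the energy supply represents, using the
# rough $/W figures quoted in this post (estimates, not actual pricing).

server_usd_per_w = 40.0          # DGX-class AI server hardware
energy_usd_per_w = (5.0, 10.0)   # "baseload"-equivalent terrestrial supply

for e in energy_usd_per_w:
    frac = e / (server_usd_per_w + e)
    print(f"${e:.0f}/W energy -> {frac:.0%} of combined capex")
```

So even free orbital power would only shave roughly 10-20% off the combined hardware-plus-energy capital cost under these assumptions.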
« Last Edit: 07/08/2025 10:09 pm by Robotbeat »

Offline RedLineTrain

  • Senior Member
  • *****
  • Posts: 3367
  • Liked: 2943
  • Likes Given: 12156
Re: Moving "the cloud" to orbit
« Reply #48 on: 07/08/2025 10:09 pm »
Yes, it DOES work that way. Inverters (both solar and batteries) can respond in fractions of a second. This is NOT true for most of the terrestrial grid, which is why it can be a problem for terrestrial datacenters.

Fundamentally, training jobs are much, much less consistent power users than inference jobs.  This is because training is a synchronized process where each step has a different power use profile.

As for whether the solar power production equipment can keep up without long-term performance degradation, that's outside my knowledge.  But I would be interested in learning more if you can explain.  For sure, batteries need to be big so that there are fewer cycles and minimal long-term battery degradation.  And if the batteries need to be big, they are quite massive.  For example, a 2 MW/4 MWh Megapack XL is some 38 tons.  My main objection to orbital cloud AI training was that terrestrially we just throw mass at the problem while in orbit it's more difficult to do so.
« Last Edit: 07/08/2025 10:14 pm by RedLineTrain »

Online Robotbeat

Re: Moving "the cloud" to orbit
« Reply #49 on: 07/08/2025 10:13 pm »
Batteries are around 200-300Wh/kg for low-end automotive packs. For space, you can use newer ones at 400-500Wh/kg, but they cost more.

You only need them to last ~40 minutes. 300Wh/kg therefore corresponds to 450W/kg. Solar arrays are around 100-150W/kg, so the battery mass is much smaller than the array mass. And batteries aren’t needed at all in sun-synchronous or high orbits (other than for very short eclipses, perhaps).
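That sizing argument can be checked in a few lines (round numbers from this post, not vendor specs):

```python
# Eclipse-battery vs. solar-array mass for a LEO load, using the round
# numbers in this post (assumptions, not vendor specs).

eclipse_hours = 40 / 60            # ~40 min LEO eclipse
battery_wh_per_kg = 300            # automotive-grade pack
array_w_per_kg = 125               # midpoint of the 100-150 W/kg range

# A pack sized to carry the load through one eclipse delivers this much
# power per kg of battery:
battery_w_per_kg = battery_wh_per_kg / eclipse_hours
print(battery_w_per_kg)            # ~450 W/kg

# Mass per kW of load carried through eclipse:
print(1000 / battery_w_per_kg)     # ~2.2 kg of battery per kW
print(1000 / array_w_per_kg)       # vs. ~8 kg of array per kW
```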

Megapack is obviously a dumb comparison as it’s not at all weight optimized.

A DGX-2 AI server costs $400k and weighs 160kg and uses 10kW, or 62.5W/kg. So basically battery mass is almost irrelevant in comparison to everything else. But it can function as shielding.

That 2MW powerpack can power 200 DGX-2 servers worth $80M. 38 tonnes, if starship gets costs to $10-100/kg, is only $400k-$4M, again practically irrelevant to the cost of the server hardware.

Full reuse makes it possible to just throw mass at the problem if you really want to. A lot of people still think about things from the gold-plated NASA or spy-satellite perspective, and not from the perspective of space launch about as cheap as airfreight.

« Last Edit: 07/08/2025 10:18 pm by Robotbeat »

Offline RedLineTrain

Re: Moving "the cloud" to orbit
« Reply #50 on: 07/08/2025 10:16 pm »
Megapack is obviously a dumb comparison as it’s not at all weight optimized.

I cross-edited you.  As stated, one of my objections was that on Earth we are able to throw mass at the problem while in orbit we are less able to do so.  That holds even with Starship.

Online Robotbeat

Re: Moving "the cloud" to orbit
« Reply #51 on: 07/08/2025 10:19 pm »
Megapack is obviously a dumb comparison as it’s not at all weight optimized.

I cross-edited you.  As stated, one of my objections was that on Earth we are able to throw mass at the problem while in orbit we are less able to do so.  That holds even with Starship.
It is not. I cross-edited you again. Even with your Megapack, Starship lowers the launch cost of the battery mass to less than one percent of server costs.

Online Robotbeat

Re: Moving "the cloud" to orbit
« Reply #52 on: 07/08/2025 10:23 pm »
Megapack (which is made out of just mild steel, with no attempt at even the mild weight reduction cars get) has a purchase cost of about $1-1.5M. If the mass is 25-38t and launch cost gets down to Starship’s $10/kg, that’s just a $0.25-0.38M launch cost. In other words, logistics becomes cheaper than the equipment purchase price even for nearly the least weight-optimized hardware.
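Under those assumptions, the launch-vs-purchase comparison is one line of arithmetic (figures are this post's estimates, not official pricing):

```python
# Launch cost vs. purchase cost for a Megapack-class battery, using the
# round numbers above (assumed, not official figures).

purchase_usd = (1.0e6, 1.5e6)   # ~$1-1.5M purchase price
mass_kg = (25_000, 38_000)      # ~25-38 t
launch_usd_per_kg = 10          # aspirational Starship cost

launch_usd = [m * launch_usd_per_kg for m in mass_kg]
print(launch_usd)               # [250000, 380000] -> $0.25-0.38M

# Launch stays below purchase price even at the heavy end:
assert max(launch_usd) < min(purchase_usd)
```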

Offline RedLineTrain

Re: Moving "the cloud" to orbit
« Reply #53 on: 07/08/2025 10:44 pm »
Megapack is obviously a dumb comparison as it’s not at all weight optimized.

I cross-edited you.  As stated, one of my objections was that on Earth we are able to throw mass at the problem while in orbit we are less able to do so.  That holds even with Starship.
it is not. I cross edited you again. Even with your mega pack, starship lowers battery mass launch cost to less than one percent of server costs.

I think that you're too sanguine on the energy density at pack level by maybe a factor of 2-4.  Megapack XL is at about 100 Wh/kg.

Quote
Full reuse makes it possible to just throw mass at the problem if you really want to. A lot of people still think about things from the gold-plated NASA or spy-satellite perspective, and not from the perspective of space launch about as cheap as airfreight.

Doesn't have much to do with cost, which I can readily concede.  Rather, it's a bottleneck in the near term for AI training jobs.  Again, these superclusters are their own beast in that they need to be tightly integrated.  Table stakes this year on the frontier is maybe 100 MW (Grok 4 is 200 MW).  End of this year into next year is 1 GW.  2027, maybe 5 GW?  Starship cannot yet keep up with that growth curve.

Of course, AI inference jobs are a completely different matter.  And you may well guess that in the very long run, AI training jobs will outgrow terrestrial capabilities.  Culture ships.
« Last Edit: 07/08/2025 10:49 pm by RedLineTrain »

Online Robotbeat

Re: Moving "the cloud" to orbit
« Reply #54 on: 07/08/2025 10:51 pm »
Megapack is obviously a dumb comparison as it’s not at all weight optimized.

I cross-edited you.  As stated, one of my objections was that on Earth we are able to throw mass at the problem while in orbit we are less able to do so.  That holds even with Starship.
it is not. I cross edited you again. Even with your mega pack, starship lowers battery mass launch cost to less than one percent of server costs.

I think that you're too sanguine on the energy density at pack level by maybe a factor of 2-4.  Megapack XL is at about 100 Wh/kg.

SpaceX ALREADY uses lithium-ion batteries on Starlink, Dragon, Falcon, and Starship, and you can look up the performance: 160-200Wh/kg.

Sandbagging the numbers doesn’t make sense. SpaceX is likely to use automotive weight packs or better. You likely would for an orbital datacenter as well.
« Last Edit: 07/08/2025 10:53 pm by Robotbeat »

Offline RedLineTrain

Re: Moving "the cloud" to orbit
« Reply #55 on: 07/08/2025 10:56 pm »
Megapack is obviously a dumb comparison as it’s not at all weight optimized.

I cross-edited you.  As stated, one of my objections was that on Earth we are able to throw mass at the problem while in orbit we are less able to do so.  That holds even with Starship.
it is not. I cross edited you again. Even with your mega pack, starship lowers battery mass launch cost to less than one percent of server costs.

I think that you're too sanguine on the energy density at pack level by maybe a factor of 2-4.  Megapack XL is at about 100 Wh/kg.

SpaceX ALREADY uses lithium-ion batteries on Starlink, dragon, Falcon, and starship, and you can look up the performance. 160-200Wh/kg.

Sandbagging the numbers doesn’t make sense. SpaceX is likely to use automotive weight packs or better. You likely would for an orbital datacenter as well.

I have no problems accepting that range (160-200 Wh/kg).

Online Robotbeat

Re: Moving "the cloud" to orbit
« Reply #56 on: 07/08/2025 11:05 pm »
Much better than that is achievable, but the cost increases by a factor of 10 or more if you want 400-500Wh per kilogram. So I agree that automotive battery packs are the more reasonable comparison for something cost-sensitive like this. But to extend cycle life, you would only want to use the middle 50% of capacity. Good packs could still probably achieve 250-300Wh/kg at acceptable cost.

Offline Lee Jay

Re: Moving "the cloud" to orbit
« Reply #57 on: 07/08/2025 11:20 pm »
There are turning out to be a few types of data centers. For AI data centers, there are two types: training and inference. Data centers devoted to training need buffered power because there are dramatic usage spikes. Buffering requires batteries, which are very heavy. …
This isn’t true. In orbit, you either have short, ~40-minute shadow periods (far shorter than the 12-18 hours of low solar on Earth), which means very small batteries (batteries are INCREDIBLY good at high power but heavy for long-duration storage… although better than people think), or you are in a sun-synchronous or high orbit and don’t need batteries at all (other than for very short periods of eclipse, perhaps).

Training workloads last for days, weeks, or months at a time. So in fact they’re MORE consistent than inference, which varies with demand from humans.

That's not the way it works, unfortunately.

https://semianalysis.com/2025/06/25/ai-training-load-fluctuations-at-gigawatt-scale-risk-of-power-grid-blackout/

Quote
Not only is the scale massive, but AI training workloads have a very unique load profile, unexpectedly rising and falling from full load to nearly idle in fractions of a second. . .  The issue caught leading AI labs by surprise. Meta’s LLaMa 3 paper mentions challenges with power fluctuations, and that is “only” a 24,000 H100 Cluster (30MW of IT capacity).
Yes, it DOES work that way. Inverters (both solar and batteries) can respond in fractions of a second. This is NOT true for most of the terrestrial grid, which is why it can be a problem for terrestrial datacenters.

If you have a source, yes.  So batteries yes, solar no.  But you're missing the point that these changes can be far, far faster than even inverters can respond.  As I posted above, we *measured* 7kHz variations at the MW scale.  Inverters typically have an open-loop bandwidth that's less than 60Hz and switching frequencies in the 3kHz range (for large ones).  These high-frequency changes cannot be controlled by inverter controls, they have to be managed at the electrical level.
There is no need to respond to changes in DC loads any faster than fractions of a second, because you can literally just use capacitors, which are in every power supply. Capacitors respond effectively infinitely fast for our purposes.

This is, again, fundamental.

The measurements I was talking about were from actual server loads with actual (tens of thousands) of power supplies in them.  The 7kHz changes were on the grid-side of the power supplies.

Offline Lee Jay

Re: Moving "the cloud" to orbit
« Reply #58 on: 07/08/2025 11:22 pm »
Batteries are around 200-300Wh/kg for low-end automotive packs. For space, you can use newer ones at 400-500Wh/kg, but they cost more.

ISS uses about 10-20% of the total storage capacity to increase cycle life from 500 cycles to 100,000 cycles, since you get 16 cycles a day.  So multiply your weights by 10 or divide your energy density by 10.
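The derating arithmetic, using the ISS-style figures above (illustrative; the actual depth-of-discharge policy would depend on cell chemistry and mission life):

```python
# Effective specific energy after depth-of-discharge derating for cycle
# life, per the ISS-style figures quoted above (illustrative numbers).

pack_wh_per_kg = 250        # nameplate pack-level specific energy
usable_fraction = 0.10      # use only ~10% of capacity per cycle

print(pack_wh_per_kg * usable_fraction)   # 25.0 Wh/kg usable per cycle

# LEO racks up eclipse cycles quickly, which is why the derating matters:
cycles_per_day = 16
print(cycles_per_day * 365 * 10)          # 58400 cycles in 10 years
```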

Offline Lee Jay

Re: Moving "the cloud" to orbit
« Reply #59 on: 07/08/2025 11:22 pm »
Megapack is obviously a dumb comparison as it’s not at all weight optimized.

I cross-edited you.  As stated, one of my objections was that on Earth we are able to throw mass at the problem while in orbit we are less able to do so.  That holds even with Starship.
it is not. I cross edited you again. Even with your mega pack, starship lowers battery mass launch cost to less than one percent of server costs.

I think that you're too sanguine on the energy density at pack level by maybe a factor of 2-4.  Megapack XL is at about 100 Wh/kg.

SpaceX ALREADY uses lithium-ion batteries on Starlink, dragon, Falcon, and starship, and you can look up the performance. 160-200Wh/kg.

Sandbagging the numbers doesn’t make sense. SpaceX is likely to use automotive weight packs or better. You likely would for an orbital datacenter as well.

I have no problems accepting that range (160-200 Wh/kg).

It's more like 25Wh/kg because of the need to preserve cycle life.

