Author Topic: Overclocked orbital data centers as a potential space industry?  (Read 14338 times)

Offline AlanSE

  • Full Member
  • *
  • Posts: 153
  • N Cp ln(T)
    • Gravity Balloon Space Habitats Inside Asteroids
  • Liked: 54
  • Likes Given: 33
Regarding the revenue streams for private development of new space industries, I'm most familiar with these:

 - Imagery of Earth, communications, and GPS
 - People will pay for trips into space, tourism or possibly colonization
 - We can mine heavy metals and bring them back to Earth
 - Energy from solar power beamed back to Earth
 - In-situ resource development for propellant, which developers of the other points will pay for

Feeling a bit underwhelmed by how convincing these are, I've been wondering if there's an overlooked revenue potential.  I want to know what people think of this proposal:

Build low-temperature data centers in space to do cloud computing

Why?  In short, because low-temperature processors have better performance; they can run at faster clock speeds.  Another motivation is that energy can be cheap in space, but this isn't very compelling.  On the other hand, it's possible to reject heat through radiators at a very low temperature, depending of course on the heat input and the size of the radiator.  The physical underpinning of this idea is extremely well-established.  We know that processors can go faster at low temperatures, and the fundamental law is that any computational operation requires a minimum energy due to the laws of thermodynamics.  This energy is proportional to the temperature.

https://en.wikipedia.org/wiki/Landauer%27s_principle

http://www.extremetemperatureelectronics.com/tutorial1.html

Because the case for this proposal is firmly rooted in physical principles, it seems that it will become more significant in the future, as long as computations performed remotely continue to have value.  Even more important - progress in single-threaded processors has stalled.  Going to lower temperatures will allow the speeds of single processors to continue to advance.  In research, this has already happened.  The most powerful single thread computations have been done at supercooled temperatures.

Of course we can get these temperatures on Earth, but we can only do so with extra energy input via a thermal cycle.  This doesn't scale well if you want to push the gross FLOPS number while at the same time running at the maximum possible speed.  That's where space would be necessary.  If you could passively cool a data center in space to an extremely low temperature then you could get a combination of high speeds while at the same time low cost per FLOP.

If you imagine commodity computing, then there will be a price per calculation, and this price will be higher if the computation is performed faster.  Even if a supercomputer in space can never compete with the cost-per-computation on Earth, it doesn't matter.  It only needs to compete with a supercomputer on Earth that runs at the same speed - and that's a race that space data centers might be able to win 10 years from now.  Additionally, computer science has extremely robust arguments establishing that not all problems can be parallelized efficiently.  So we should expect the demand to remain strong.

The challenges would obviously be immense, which is why I'm posting this on a forum with rocket scientists.  The radiator design would be a nightmare, particularly in LEO, which is the most obvious location considering the time lag.  The JWST sits in a spot where low temperatures are easier to manage, but it's not as good for providing data services.  In LEO, you would need both a sun shield and an Earth shield.  It could require completely new, radical radiator designs.  Radiation would be an economic deal-breaker, so the computers would have to be shielded.  That's a problem, and I think it could only be solved with material transported from lunar or asteroid sources, considering the scale of shielding needed to reduce noise to a desirable level.  The computers themselves would also have to be much lighter than the clusters we use today, combined with lower launch costs.

Nonetheless, I think this is a similar scale to what Planetary Resources is looking into doing.  If the asteroid mining business gets off the ground, I think that infrastructure could also be used to develop this (possibly significant) product of "bulk high-speed cloud computing".  This seems obviously important for making the sales pitch for private development of space.  I have not heard anyone else make this argument, so now I'm making it.
« Last Edit: 12/16/2013 03:40 pm by AlanSE »

Offline Jim

  • Night Gator
  • Senior Member
  • *****
  • Posts: 37440
  • Cape Canaveral Spaceport
  • Liked: 21450
  • Likes Given: 428
The transportation, infrastructure and maintenance costs outweigh the temperature advantage.

Offline DMeader

  • Full Member
  • ****
  • Posts: 959
  • Liked: 103
  • Likes Given: 48
Also, for anything beyond LEO, consider the latency issues.

Offline AlanSE

  • Full Member
  • *
  • Posts: 153
  • N Cp ln(T)
    • Gravity Balloon Space Habitats Inside Asteroids
  • Liked: 54
  • Likes Given: 33
Also, for anything beyond LEO, consider the latency issues.

That's a meaningful detail.  For LEO itself, there are still bandwidth issues.  To the best of my understanding, we can achieve pretty good connectivity with things in LEO, but it can't hope to compare to the amount of data you can push through a stationary fiber cable on Earth.  There would be a premium on the amount of data you send to the satellite and that you get back from it.  But this is something I don't know as much about.  We have a large amount of data transmission with space telescopes and satellite communications, but "large" is subjective here.  I don't imagine it can compare to what the communications giants are doing.

Realistically, any primitive version of this would have to exploit some premium of time and place.  For instance, this sort of platform could be used as a data relay with computations performed in between, which can have value far beyond the brute computation itself.  A popular example is financial trading bots.  People have proposed that a floating data center between NYC and London could exploit price differences that no one else has access to.

There's some discussion of this toward the 13 minute mark in this video:

http://www.ted.com/talks/kevin_slavin_how_algorithms_shape_our_world.html

This would obviously apply to satellites as well, and I would expect some early versions to do things like this (maybe they already do this).  However, I doubt there's much money to be made off of such small differences, and the thermodynamic argument can't be significant without some serious scaling up.

Offline Spugpow

  • Member
  • Posts: 22
  • Liked: 3
  • Likes Given: 3
Perhaps another use for banks of computers in space is to host sensitive data/legally dubious websites like wikileaks.

Offline LegendCJS

  • Full Member
  • ****
  • Posts: 575
  • Boston, MA
  • Liked: 7
  • Likes Given: 2
The fundamental assumption of yours, that it is easy to cool things off when surrounded by the best insulator people know how to make, i.e. vacuum, is seriously flawed.
Remember: if we want this whole space thing to work out we have to optimize for cost!

Offline AlanSE

  • Full Member
  • *
  • Posts: 153
  • N Cp ln(T)
    • Gravity Balloon Space Habitats Inside Asteroids
  • Liked: 54
  • Likes Given: 33
The fundamental assumption of yours, that it is easy to cool things off when surrounded by the best insulator people know how to make, i.e. vacuum, is seriously flawed.

I do hope you're familiar with (sigma * T^4).  If you'd like, I can write out the Carnot efficiency formula, which is relevant for running a data center on Earth at the same temperature.  With this and other equations, you could in somewhat short order produce a calculator which can compare the cost of running computations on this satellite versus its terrestrial counterpart given your assumptions about the prices for everything.
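A minimal sketch of such a calculator in Python, combining (sigma * T^4) with the Carnot formula.  The 100 kW load, 300 K ambient, and ideal emissivity are assumed placeholder figures, not real estimates:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area_m2(heat_w, t_cold_k):
    """Area of an ideal (emissivity 1) radiator facing deep space
    needed to reject heat_w watts at temperature t_cold_k."""
    return heat_w / (SIGMA * t_cold_k ** 4)

def carnot_input_power_w(heat_w, t_cold_k, t_hot_k=300.0):
    """Minimum (Carnot-limited) work rate to pump heat_w watts from
    t_cold_k up to a t_hot_k ambient, i.e. the Earth-side penalty."""
    cop = t_cold_k / (t_hot_k - t_cold_k)  # ideal coefficient of performance
    return heat_w / cop

heat = 100e3  # assumed 100 kW of waste heat
for t in (250.0, 150.0, 77.0):
    print(f"T = {t:5.1f} K: space radiator {radiator_area_m2(heat, t):9.0f} m^2, "
          f"Earth cryo-plant input {carnot_input_power_w(heat, t):9.0f} W")
```

Plugging in your own prices for radiator area, launch mass, and electricity turns this into the cost comparison described above.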

However, Jim assures me that he's done all of this, and even included realistic figures for the maintenance.  I eagerly await the mathematics that lead to his final inequality that he shared with us.

Offline Jim

  • Night Gator
  • Senior Member
  • *****
  • Posts: 37440
  • Cape Canaveral Spaceport
  • Liked: 21450
  • Likes Given: 428

However, Jim assures me that he's done all of this, and even included realistic figures for the maintenance.  I eagerly await the mathematics that lead to his final inequality that he shared with us.

Not needed.  It's blatantly obvious.  The little efficiency gained by the temperature advantage is grossly overshadowed by the logistics.  It doesn't take a rocket scientist to see it.

That is why you haven't heard anybody make the argument.

Offline Jim

  • Night Gator
  • Senior Member
  • *****
  • Posts: 37440
  • Cape Canaveral Spaceport
  • Liked: 21450
  • Likes Given: 428
Regarding the revenue streams for private development of new space industries, I'm most familiar with these:

 - Imagery of Earth, communications, and GPS

The part of the list that is viable, and may remain the only viable one.

Offline D_Dom

  • Global Moderator
  • Full Member
  • *****
  • Posts: 655
  • Liked: 481
  • Likes Given: 152
I am glad to see you recognize the challenges are immense. Can you demonstrate a basic understanding of said challenges by providing data supporting your claims?
Avoid quoting wikipedia and show supporting evidence of
"energy can be cheap in space" or
"progress in single threaded processors has stalled" or
"passively cool a data center in space"

"I think it could only be solved by using resources transported from lunar or asteroid resources" is a reasonable statement, maybe we will see that capability exist in my lifetime, I certainly hope so.
Space is not merely a matter of life or death, it is considerably more important than that!

Offline IRobot

  • Full Member
  • ****
  • Posts: 1312
  • Portugal & Germany
  • Liked: 310
  • Likes Given: 272
Even more important - progress in single-threaded processors has stalled.  Going to lower temperatures will allow the speeds of single processors to continue to advance.  In research, this has already happened.  The most powerful single thread computations have been done at supercooled temperatures.
That's why real engineers invented multi-threaded programming and GPU computing, instead of sending computers to space.

Also, although core speed has not increased much in recent years, power consumption has been severely reduced.

Offline AlanSE

  • Full Member
  • *
  • Posts: 153
  • N Cp ln(T)
    • Gravity Balloon Space Habitats Inside Asteroids
  • Liked: 54
  • Likes Given: 33
Regarding the revenue streams for private development of new space industries, I'm most familiar with these:

 - Imagery of Earth, communications, and GPS

The part of the list that is viable, and may remain the only viable one.

I make no claim that the proposal would be more lucrative than the other items on the list.  The concept was only ever intended to be interesting to people who are already interested in those other points.


However, Jim assures me that he's done all of this, and even included realistic figures for the maintenance.  I eagerly await the mathematics that lead to his final inequality that he shared with us.

Not needed.  It's blatantly obvious.  The little efficiency gained by the temperature advantage is grossly overshadowed by the logistics.  It doesn't take a rocket scientist to see it.

That is why you haven't heard anybody make the argument.

Will you clarify what you mean by "efficiency" in this context?  Some possibilities are:

 - Thermodynamic efficiency
 - The energy required per computation
 - The energy required per computation at a given temperature
 - Economic efficiency

I'm getting somewhat tired of being the only one here making references to actual physical laws and units.

I am glad to see you recognize the challenges are immense. Can you demonstrate a basic understanding of said challenges by providing data supporting your claims?
Avoid quoting wikipedia and show supporting evidence of
"energy can be cheap in space" or
"progress in single threaded processors has stalled" or
"passively cool a data center in space"

Processor speed has leveled off.  That was my point, and I thought I provided elaboration on it, but I'm always happy to give more clarification.  The phenomenon of leveling off of processor speeds is well documented.

http://www.gotw.ca/images/CPU.png

By "passive", I mean that it is not cooled by a thermal (cryogenic) cycle.  That is what you would need if you demanded to run something on Earth at very low temperatures.  The JWST, for instance, will operate at about 50 kelvin.  Such temperatures are very commonly achieved in labs, obviously, since liquid hydrogen is colder still.  You just don't get them passively; you put energy into a thermal cycle to sustain that temperature.  Any heat production (which computation will cause) has to be removed by that cycle, and you're penalized by the coefficient of performance (COP); the lower the temperature you go to, the worse that ratio becomes.  That's the case for Earth.  In space, for passive heat removal, the obvious physical constraint is a balance between the heat production, the radiator area, and the temperature.
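That balance can be written down directly.  A small sketch in Python, where the 100 kW load and the radiator areas are assumed illustrative numbers:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def passive_equilibrium_temp_k(heat_w, area_m2):
    """Steady-state temperature of an ideal radiator rejecting heat_w
    watts over area_m2, assuming a clear view of deep space and no
    environmental heat input (no sun or Earth in view)."""
    return (heat_w / (SIGMA * area_m2)) ** 0.25

# For an assumed 100 kW load: a modest radiator settles near room
# temperature, and reaching cryogenic territory takes enormous area.
print(passive_equilibrium_temp_k(100e3, 1_000))   # ~205 K
print(passive_equilibrium_temp_k(100e3, 50_000))  # ~77 K
```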

I'm not arguing about the cost of delivering energy in space.  When I mentioned it in my post, I said it wasn't compelling.  Advocates of space-based solar power transmitted via microwave would obviously maintain the position that it is cost-efficient.  My proposal, on the other hand, doesn't even directly require it.  There are several multipliers that would allow such a data center to make more money per unit of energy it uses, compared to its ground-based counterpart.

Even more important - progress in single-threaded processors has stalled.  Going to lower temperatures will allow the speeds of single processors to continue to advance.  In research, this has already happened.  The most powerful single thread computations have been done at supercooled temperatures.
That's why real engineers invented multi-threaded programming and GPU computing, instead of sending computers to space.

Also, although core speed has not increased much in recent years, power consumption has been severely reduced.

There are two concepts here: clock speed and energy consumption.  Lower energy consumption is trivially better.  Faster clock speed is also desirable in a way that you may not have appreciated.  In computer science, Amdahl's law is a rule for quantifying the speedup you get from using multiple processors as opposed to one; the speedup is always less than the number of processors.

https://en.wikipedia.org/wiki/Amdahl%27s_law

It is a very strong theoretical claim in computer science that 1 processor doing 10*N operations is superior to 10 processors each doing N operations.  That means that you can solve more problems with the former than with the latter.
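To make that concrete, here is a quick sketch of Amdahl's law in Python; the 90% parallel fraction is an assumed example, not a measured workload:

```python
def amdahl_speedup(parallel_fraction, n_processors):
    """Amdahl's law: speedup of a workload whose parallel_fraction
    (0..1) can be spread across n_processors; the rest stays serial."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_processors)

# A workload that is 90% parallelizable: ten processors give only ~5.3x,
# while a single processor clocked 10x faster gives the full 10x.
print(amdahl_speedup(0.9, 10))  # ~5.26
```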

Offline Jim

  • Night Gator
  • Senior Member
  • *****
  • Posts: 37440
  • Cape Canaveral Spaceport
  • Liked: 21450
  • Likes Given: 428

I'm getting somewhat tired of being the only one here making references to actual physical laws and units.


Not my fault that you are only making references and not providing hard data supporting your claim, nor is it our fault that you are proposing a complex solution to a nonexistent problem.

It is very easy to see the non-viability.

If low temp computing is such a need, or even desired, where are all the systems/installations for those who can afford them, like the military, NSA, national labs, etc.?  They could pay for the cryogenic cooling if it were desired.

We can ignore that for a moment. 

What are the:
Costs to design and build a space-based low temp computing platform
Costs to launch said platform
Costs and logistics to maintain said platform, both the spacecraft portion (propellant and hardware) and the payload portion (data storage and CPUs).  This will require servicing spacecraft with their inherent launches
Costs of the comm infrastructure for said platform.  It would need a more robust system than NASA's TDRSS (more spacecraft and ground stations, larger spacecraft, etc.)

Hmmmm, wait a minute.  Instead of doing all these launches, let's just take the propellants and pressurants and the other cryogens used for the launches and send them to a cryogenic computing center.  That is what is meant by efficiency.




Offline randomly

  • Full Member
  • ****
  • Posts: 674
  • Liked: 326
  • Likes Given: 182
The concept is flawed because although you can passively achieve very low temperatures in space (eg JWST) this is only at very low energy flows. The only way to practically dissipate heat in space is via radiation and this obeys the Stefan-Boltzmann law that power radiated is proportional to the fourth power of temperature.

You will need to dissipate a great deal of heat which forces your radiators to be massive if you are trying to do this passively.

If you go the active cooling approach it becomes vastly easier to actually do it on earth, it would also be vastly cheaper, especially from a maintenance and upgrade point of view.

Also if there was some economic advantage you would see cryocooled processors in use, at least in niche applications. But you do not.

Offline AlanSE

  • Full Member
  • *
  • Posts: 153
  • N Cp ln(T)
    • Gravity Balloon Space Habitats Inside Asteroids
  • Liked: 54
  • Likes Given: 33
So economic efficiency.  I'm not asking trick questions, and your last comment was productive:

It is very easy to see the non-viability.

If low temp computing is such a need, or even desired, where are all the systems/installations for those who can afford them, like the military, NSA, national labs, etc.?  They could pay for the cryogenic cooling if it were desired.

The proposal is to provide a commodity.  If any commodity can be delivered to market at the running price, then we should declare it to have solved a problem.  I presume that anyone who buys platinum (for instance) on the market needed a metal for something they were doing.  A decade from now, I imagine that cloud computing will fully be a commodity (but you're free to disagree with that assumption as well).  This is the central claim:

"If you could passively cool a data center in space to an extremely low temperature then you could get a combination of high speeds while at the same time low cost per FLOP."

I agree that we should consider this in a specific and comparative sense.  Actually, we're fairly close to converging on the criteria that must be satisfied for the concept to be viable.  All of the costs that are unique to operation in space must, at minimum, be lower than the energy costs of the cryo-cycle that runs a counterpart data center on Earth.

The case for this would be strengthened if, responding to market demand, we started to see many cryogenic data centers built on Earth.  That development would be an obvious indicator that orbiting data centers may be approaching profitability.  Currently, this is not the case.  That could change.  This proposal comes with connected predictions and equations that could be used to evaluate some comparative economics.  In other words, the best kind of proposal - a falsifiable one.

The concept is flawed because although you can passively achieve very low temperatures in space (eg JWST) this is only at very low energy flows. The only way to practically dissipate heat in space is via radiation and this obeys the Stefan-Boltzmann law that power radiated is proportional to the fourth power of temperature.

You will need to dissipate a great deal of heat which forces your radiators to be massive if you are trying to do this passively.

But you've neglected Landauer's principle.  The energy you need to remove per computation decreases as you decrease temperature.  You're using thermal engineering, but you also have to use the thermodynamics of computation.  You have to combine BOTH.  There is no other way for the concept to make sense.  This is what you must add:

E = k T ln 2

This is the energy per computation.  The colder you get, the less energy you have to dissipate through the radiator.
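For scale, the limit is easy to evaluate in Python.  A caveat worth keeping in mind: real chips today dissipate many orders of magnitude more per operation than this theoretical bound.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit_j(temp_k):
    """Minimum energy to erase one bit at temperature temp_k
    (Landauer's principle: E = k * T * ln 2)."""
    return K_B * temp_k * math.log(2)

print(landauer_limit_j(300.0))  # ~2.87e-21 J at room temperature
print(landauer_limit_j(77.0))   # ~7.4e-22 J at liquid-nitrogen temperature
```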
« Last Edit: 12/16/2013 08:33 pm by AlanSE »

Offline IRobot

  • Full Member
  • ****
  • Posts: 1312
  • Portugal & Germany
  • Liked: 310
  • Likes Given: 272
You are also forgetting that servers require a lot of physical maintenance (server farms, not a single server) and that the LEO environment will produce numerous processing errors, not to mention material degradation.

Offline Robotbeat

  • Senior Member
  • *****
  • Posts: 39270
  • Minnesota
  • Liked: 25240
  • Likes Given: 12115
I thought of a similar idea, as a way to do something useful with space-based solar power without having to beam the power.  But the temperature problem is actually harder in space, since radiating heat is arguably a harder problem than on Earth, where you have an atmosphere (or bodies of water) to easily dump heat to.

My idea was to perform latency-tolerant computations using the plentiful solar energy. The biggest problem here is if Moore's Law continues... by the time you've built your spacecraft and launched it and started operating it, a process that takes years, state of the art terrestrial processing power will have become significantly cheaper, meaning your advantage in theoretically lower cost power is lost. Also, it'd be hard to make a big, cheap radiator to dump heat that's roughly at room temperature.

However, if Moore's Law slows down (and especially if heat tolerant chips are cheap) and power becomes the most expensive input to computation by an order of magnitude, it might become worth it... You could put your celestial data center even closer to the Sun to collect solar power even cheaper. But the cost of building the radiator wouldn't improve by getting closer to the Sun (it'd get a bit worse, in fact).

However, it might be possible to run a computer that /requires/ cryogenic temperatures, like some sort of quantum computer or something operating with superconductors. It might be that such computers would still have a lot of waste heat, but rejecting waste heat is REALLY expensive at cryogenic temperatures...

However, there is one place that we've explored a bit that is cryogenic (~90 Kelvin, significantly lower than the critical temperature of some superconductors we've already developed) but actually has better heat rejection characteristics than the Earth's atmosphere... That place is Titan. Hopefully you can tolerate latencies measured in hours! :D
(But for some supercomputer simulations, that shouldn't be a problem.)
Chris  Whoever loves correction loves knowledge, but he who hates reproof is stupid.

To the maximum extent practicable, the Federal Government shall plan missions to accommodate the space transportation services capabilities of United States commercial providers. US law http://goo.gl/YZYNt0

Offline Robotbeat

  • Senior Member
  • *****
  • Posts: 39270
  • Minnesota
  • Liked: 25240
  • Likes Given: 12115
...
But you've neglected Landauer's principle.  The energy you need to remove per computation decreases as you decrease temperature.  You're using thermal engineering, but you also have to use the thermodynamics of computation.  You have to combine BOTH.  There is no other way for the concept to make sense.  This is what you must add:

E = k T ln 2

This is the energy per computation.  The colder you get, the less energy you have to dissipate through the radiator.
This is true, however heat dissipated via radiation is proportional to the fourth power of temperature... which definitely beats the simple single power of temperature in your equation at some point. Radiator structure is going to be around the same order of magnitude as your solar array, if you're trying to reject heat at low temperatures. (as usual, the optimum will be somewhere in the middle...)
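The scaling works against cold operation here: at the Landauer limit the heat per operation falls only linearly with T, while radiated power per area falls as T^4, so the radiator area per unit compute rate grows as 1/T^3.  A sketch in Python:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area_per_op_rate(t_k):
    """m^2 of ideal radiator needed per (op/s) for a computer
    dissipating exactly k*T*ln(2) per operation:
    area = rate * k * T * ln2 / (sigma * T^4) = k*ln2 / (sigma * T^3)."""
    return (K_B * math.log(2)) / (SIGMA * t_k ** 3)

# Cooling from 300 K down to 77 K multiplies the required area ~59x.
print(radiator_area_per_op_rate(77.0) / radiator_area_per_op_rate(300.0))
```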


...on the other hand, if your computations are reversible, you don't actually need to reject any heat... ;)
« Last Edit: 12/16/2013 09:05 pm by Robotbeat »
Chris  Whoever loves correction loves knowledge, but he who hates reproof is stupid.

To the maximum extent practicable, the Federal Government shall plan missions to accommodate the space transportation services capabilities of United States commercial providers. US law http://goo.gl/YZYNt0

Offline Robotbeat

  • Senior Member
  • *****
  • Posts: 39270
  • Minnesota
  • Liked: 25240
  • Likes Given: 12115
You are also forgetting that servers require a lot of physical maintenance (server farms, not a single server) and that the LEO environment will produce numerous processing errors, not to mention material degradation.
We can hand-wave that away. :) The amount of physical maintenance required is a design variable.  Engineering choices determine how much maintenance is required.  You can build a system that can operate for years (or even decades) with zero physical maintenance, and I've seen such systems marketed even for terrestrial data servers.  You just need to do the right systems engineering and have enough spares (you can operate entire servers as spares, too... this is partly how Google works).

As far as processing errors and material degradation, well that's also quantifiable and something you can engineer. For processing errors: Parity checks, redundancy, watch-dog timers, inherent radiation resistance, and shielding are all possible ways to address the issue (and this can be an issue even on Earth... nobody sane runs a server without ECC these days, SSDs and HDs already include internal consistency checks, RAIDs are common place and a RAID-like architecture is used even inside an SSD, etc). Material degradation is just a typical satellite engineering constraint, no different from what current commsat providers need to consider.


But again, none of this is terribly relevant until Moore's Law slows way down.
« Last Edit: 12/16/2013 09:14 pm by Robotbeat »
Chris  Whoever loves correction loves knowledge, but he who hates reproof is stupid.

To the maximum extent practicable, the Federal Government shall plan missions to accommodate the space transportation services capabilities of United States commercial providers. US law http://goo.gl/YZYNt0

Offline AlanSE

  • Full Member
  • *
  • Posts: 153
  • N Cp ln(T)
    • Gravity Balloon Space Habitats Inside Asteroids
  • Liked: 54
  • Likes Given: 33
You are also forgetting that servers require a lot of physical maintenance (server farms, not a single server) and that the LEO environment will produce numerous processing errors, not to mention material degradation.

I have very much forgotten the maintenance, because I have no relevant industry experience in this, and cannot comment on it.

When you refer to the LEO environment, do you mean the radiation or something else?  I don't believe it would ever make sense without significant shielding.  Without a space industry, lifting that much shielding mass to orbit would be unreasonable unless the computers could be made extremely small.  I guess I can't dismiss that possibility, but even if their size were zero, a literal point, and your shielding were spherical, you couldn't cheaply match the radiation environment you get for free on Earth.  Two meters of shielding around a point-computer is already a 21 ton launch!
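The shield mass estimate is easy to parameterize in Python.  The water density used in the example is an assumed shielding material, and the exact tonnage depends heavily on that choice:

```python
import math

def spherical_shield_mass_kg(inner_radius_m, thickness_m, density_kg_m3):
    """Mass of a spherical shell of the given thickness and density
    wrapped around a payload of radius inner_radius_m."""
    r_out = inner_radius_m + thickness_m
    volume = (4.0 / 3.0) * math.pi * (r_out ** 3 - inner_radius_m ** 3)
    return volume * density_kg_m3

# A point-sized computer under 2 m of water-density shielding:
print(spherical_shield_mass_kg(0.0, 2.0, 1000.0) / 1000.0)  # ~33.5 tonnes
```

The mass grows with the cube of the outer radius, so any realistic (non-point) payload makes the number far worse.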

Perhaps there are other concerns for operating in a vacuum.

However, I wonder if people have fully appreciated that the computational limit is truly fundamental.  If optical computing became possible, it would still work better at low temperatures.

However, if Moore's Law slows down (and especially if heat tolerant chips are cheap) and power becomes the most expensive input to computation by an order of magnitude, it might become worth it... You could put your celestial data center even closer to the Sun to collect solar power even cheaper. But the cost of building the radiator wouldn't improve by getting closer to the Sun (it'd get a bit worse, in fact).

However, it might be possible to run a computer that /requires/ cryogenic temperatures, like some sort of quantum computer or something operating with superconductors. It might be that such computers would still have a lot of waste heat, but rejecting waste heat is REALLY expensive at cryogenic temperatures...

Two things here:

Moving closer to the sun would likely hurt, not help, the basic case for this proposal.  But if the radiator extends out into the umbra, then it becomes less clear.  Also, you can't "trick" nature by adding a thermal cycle.  Landauer's principle holds precisely because you could otherwise use a thermal cycle to lower the temperature of a computer; if computation were equally efficient at all temperatures, you could build a perpetual motion machine.

Quantum computing already requires super cold temperatures.  These temperatures are far below the temperature of the CMB, so even in space it would require a thermal cycle.  That's why I did not propose it.
« Last Edit: 12/16/2013 09:23 pm by AlanSE »
