Also, for anything beyond LEO, consider the latency issues.
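To put rough numbers on that, here's a minimal sketch of the light-time-only latency (the orbital distances are approximate, and real latency adds routing and processing delays on top):

```python
# Rough one-way light-time latency from various orbits (speed of light only,
# ignoring routing, processing, and ground-segment delays).
C = 299_792.458  # km/s, speed of light

orbits_km = {
    "LEO (~500 km)": 500,
    "GEO (~35,786 km)": 35_786,
    "Moon (~384,400 km)": 384_400,
}

for name, distance in orbits_km.items():
    one_way_ms = distance / C * 1000
    print(f"{name}: one-way ~{one_way_ms:.1f} ms, round trip ~{2 * one_way_ms:.1f} ms")
```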
Your fundamental assumption, that it is easy to cool things off when surrounded by the best insulator people know how to make (i.e. vacuum), is seriously flawed.
However, Jim assures me that he's done all of this, and even included realistic figures for the maintenance. I eagerly await the mathematics that led to the final inequality he shared with us.
Regarding the revenue streams for private development of new space industries, I'm most familiar with these:
- Imagery of Earth, communications, and GPS
Even more important: progress in single-threaded processors has stalled. Going to lower temperatures will allow the speed of individual processors to continue to advance. In research, this has already happened: the most powerful single-threaded computations have been done at supercooled temperatures.
Quote from: AlanSE on 12/16/2013 03:23 pm
Regarding the revenue streams for private development of new space industries, I'm most familiar with these: - Imagery of Earth, communications, and GPS

That part of the list is viable, and may be the only part that remains viable.
Quote from: AlanSE on 12/16/2013 05:17 pm
However, Jim assures me that he's done all of this, and even included realistic figures for the maintenance. I eagerly await the mathematics that led to the final inequality he shared with us.

Not needed. It's blatantly obvious. The little efficiency gained from the temperature advantages is grossly overshadowed by the logistics. It doesn't take a rocket scientist to see it. That is why you haven't heard anybody make the argument.
I am glad to see you recognize the challenges are immense. Can you demonstrate a basic understanding of said challenges by providing data supporting your claims? Avoid quoting Wikipedia, and show supporting evidence for "energy can be cheap in space", "progress in single-threaded processors has stalled", or "passively cool a data center in space".
Quote from: AlanSE on 12/16/2013 03:23 pm
Even more important: progress in single-threaded processors has stalled. Going to lower temperatures will allow the speed of individual processors to continue to advance. In research, this has already happened: the most powerful single-threaded computations have been done at supercooled temperatures.

That's why real engineers invented multi-threaded programming and GPU computing, instead of sending computers to space. Also, although core speed has not increased much in recent years, power consumption has been severely reduced.
I'm getting somewhat tired of being the only one here making references to actual physical laws and units.
It is very easy to see the non-viability. If low-temperature computing were such a need, or even desired, where are all the systems/installations for those who can afford them, like the military, NSA, national labs, etc.? They could pay for the cryogenic cooling if it were desired.
The concept is flawed because, although you can passively achieve very low temperatures in space (e.g. JWST), this is only at very low energy flows. The only way to practically dissipate heat in space is via radiation, and this obeys the Stefan-Boltzmann law: the power radiated is proportional to the fourth power of temperature. You will need to dissipate a great deal of heat, which forces your radiators to be massive if you are trying to do this passively.
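Here's a back-of-the-envelope sketch of what that fourth-power law means for radiator area; the 1 MW heat load and the emissivity of 0.9 are illustrative assumptions, not figures anyone here has quoted:

```python
# Radiator area needed to reject a given heat load purely by radiation,
# from P = eps * sigma * A * T^4 (Stefan-Boltzmann), ignoring absorbed
# sunlight or Earth-shine, which only make the real area larger.
SIGMA = 5.670e-8   # W m^-2 K^-4, Stefan-Boltzmann constant
EPSILON = 0.9      # assumed emissivity of the radiator surface

def radiator_area_m2(heat_load_w: float, temp_k: float) -> float:
    """Ideal radiating area for a one-sided radiator held at temp_k."""
    return heat_load_w / (EPSILON * SIGMA * temp_k ** 4)

heat_load = 1.0e6  # W, an illustrative data-center-scale heat load
for temp in (300, 150, 77):
    print(f"{temp:>4} K radiator: {radiator_area_m2(heat_load, temp):,.0f} m^2")
```

The colder you want the radiating surface, the faster the required area blows up.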
...But you've neglected Landauer's principle. The energy you need to remove per computation decreases as you decrease temperature. You're using thermal engineering, but you also have to use the thermodynamics of computation. You have to combine BOTH. There is no other way for the concept to make sense. This is what you must add:

E = k T ln 2

This is the energy per computation. The colder you get, the less energy you have to dissipate through the radiator.
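For concreteness, a minimal sketch of that bound at a few temperatures (the temperatures are just illustrative, and real chips dissipate many orders of magnitude more per operation than the Landauer limit):

```python
# Landauer's bound: minimum energy dissipated per irreversible bit operation,
# E = k * T * ln(2), and how it scales with temperature.
import math

K_B = 1.380649e-23  # J/K, Boltzmann constant

def landauer_joules_per_bit(temp_k: float) -> float:
    """Minimum energy dissipated to erase one bit at temperature temp_k."""
    return K_B * temp_k * math.log(2)

for temp in (300, 77, 4):
    print(f"{temp:>3} K: {landauer_joules_per_bit(temp):.2e} J per bit erased")
```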
You are also forgetting that servers require a lot of physical maintenance (server farms, not a single machine) and that the LEO environment will produce frequent processing errors, not to mention material degradation.
However, if Moore's Law slows down (and especially if heat-tolerant chips are cheap) and power becomes the most expensive input to computation by an order of magnitude, it might become worth it... You could put your celestial data center even closer to the Sun to collect solar power even more cheaply. But the cost of building the radiator wouldn't improve by getting closer to the Sun (it'd get a bit worse, in fact). However, it might be possible to run a computer that /requires/ cryogenic temperatures, like some sort of quantum computer or something operating with superconductors. It might be that such computers would still have a lot of waste heat, but rejecting waste heat is REALLY expensive at cryogenic temperatures...
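A rough sketch of just how expensive, under two simplifying assumptions (an ideal one-sided radiator with an assumed emissivity of 0.9, and an ideal Carnot heat pump); the 4 K operating temperature is purely illustrative:

```python
# Two ways to see why rejecting waste heat at cryogenic temperature is costly:
# (1) radiate it directly at the cold temperature -> huge area (P scales as T^4), or
# (2) pump it up to a warm radiator -> large input work (Carnot penalty).
SIGMA = 5.670e-8  # W m^-2 K^-4, Stefan-Boltzmann constant
EPSILON = 0.9     # assumed radiator emissivity

def area_per_watt(temp_k: float) -> float:
    """Ideal radiator area needed per watt rejected directly at temp_k."""
    return 1.0 / (EPSILON * SIGMA * temp_k ** 4)

def carnot_work_per_watt(cold_k: float, hot_k: float) -> float:
    """Minimum pump work per watt of heat lifted from cold_k to a hot_k radiator."""
    return (hot_k - cold_k) / cold_k

print(f"Direct radiation at 4 K:   {area_per_watt(4):,.0f} m^2 per W")
print(f"Direct radiation at 300 K: {area_per_watt(300):.4f} m^2 per W")
print(f"Carnot work to lift 1 W from 4 K to 300 K: {carnot_work_per_watt(4, 300):.0f} W")
```

Either way you pay: enormous radiator area at the cold end, or tens of watts of pumping power per watt of waste heat at the warm end.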