Be patient people, rockets are hard.
Without much work a server could contain 10TB of static internet data. This could include the most commonly watched video on sites like YouTube, the data of most popular websites up to that point, and lifetimes of educational material, including video.
For a sense of scale: a couple of years ago I started the Internet-in-a-Box project to deploy content to schools in developing countries that lack inexpensive internet access.
Cute little problem. Conceptually most content on the web could be cached on the Mars side, e.g. YouTube, Wikipedia, news sites and so on, but modern content is not written that way, and you can't ask every single website designer to cater to such a small population.

I wonder if the solution is closely related to web archiving: http://en.wikipedia.org/wiki/Web_archiving

After all, you could view the Mars colonists as "future researchers". They happen to only be 40 minutes in the future.
Will need to be error-corrected/resistant to radiation, both on the trip and during the stay there. That will at least double the storage costs...
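As a back-of-envelope comparison of what "doubling the storage" buys versus an erasure code: the (255, 223) Reed-Solomon parameters below are the classic deep-space telemetry code, but whether they would suit bulk storage here is purely an assumption for illustration.

```python
# Rough storage-overhead comparison for radiation tolerance (illustrative
# figures only): straight mirroring vs. a Reed-Solomon-style erasure code.

def overhead(stored_bytes, useful_bytes):
    """Extra storage as a fraction of the useful data."""
    return stored_bytes / useful_bytes - 1

# Straight 2x mirroring, as assumed above:
print(f"mirroring: {overhead(2 * 10, 10):.0%} overhead")    # 100%

# A (255, 223) Reed-Solomon code stores 255 bytes per 223 useful bytes:
print(f"RS(255,223): {overhead(255, 223):.1%} overhead")    # ~14.3%
```

The point of the sketch is only that the overhead depends heavily on the coding scheme chosen, not that any particular code is the right one for a Mars cache.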
Quote from: scamanarchy on 06/14/2014 03:35 am
Without much work a server could contain 10TB of static internet data. This could include all of the video commonly watched on sites like youtube and most of the popular websites data to that point and lifetimes of educational material including video.

OT, but - 10TB? Try 10PB. Those of us who have HTPCs know how little 10TB buys you. "All of the commonly watched video on sites like Youtube" is a very large set indeed - in 2012, uploads amounted to an estimated 76PB/yr. For PDF books, 10TB is a reasonably good number, catching a sizable subset of the existing jailbroken textbook corpus. For Internet text? Wholly insufficient. The Internet Archive passed the 10PB mark two years ago.
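A quick back-of-envelope check of that scale gap (all numbers taken from the post above; the 76PB/yr figure is the 2012 estimate cited there):

```python
# How far does a 10TB cache go against 2012-era YouTube upload volume?
TB = 1
PB = 1000 * TB  # decimal units, as disk vendors count them

proposed_cache = 10 * TB
youtube_uploads_2012 = 76 * PB  # estimated uploads per year

fraction = proposed_cache / youtube_uploads_2012
print(f"10 TB holds {fraction:.5%} of one year's 2012 uploads")
# i.e. roughly 0.013% -- about 1.2 hours' worth of a year's uploads
```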
Quote from: KelvinZero on 06/13/2014 12:41 pm
I wonder if the solution is closely related to web archiving: http://en.wikipedia.org/wiki/Web_archiving

A heck of a lot of web content is already geographically cached for lower-latency access over cheaper links.
I wonder if the solution is closely related to web archiving: http://en.wikipedia.org/wiki/Web_archiving
I was just thinking of those webpages that don't finish loading until something has replied to a few bytes of information sent from your browser, etc. I expect caching would mainly reduce bandwidth but not interfere with attempts to communicate with the server. Otherwise you are stuck viewing Mars-friendly sites.
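The split being described here, static content answered locally while anything interactive still pays the full round trip, can be sketched roughly like this (all names and the cached page are hypothetical, purely for illustration):

```python
# Minimal sketch of a Mars-side cache: static GETs are answered locally,
# anything dynamic is queued for the long Earth round trip.
STATIC_CACHE = {
    "http://en.wikipedia.org/wiki/Web_archiving": "<html>...cached copy...</html>",
}

deep_space_queue = []  # requests that genuinely need Earth

def fetch(url, method="GET", body=None):
    if method == "GET" and url in STATIC_CACHE:
        return STATIC_CACHE[url]       # answered locally, no latency
    deep_space_queue.append((method, url, body))
    return None                        # caller waits ~40+ minutes for Earth

page = fetch("http://en.wikipedia.org/wiki/Web_archiving")   # local hit
fetch("http://example.com/api/like", method="POST", body="post=123")
print(len(deep_space_queue))  # -> 1 queued round trip
```

The interesting part is the fallback path: the cache can't fake the server's reply, so interactive sites would still feel the latency even with a perfect local mirror.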
Doesn't have to be space-rated. People launch Arduinos and cellphones into space and /operate/ them (you need a good watchdog circuit). Just bury it under a few feet of soil when you get there, and you'll be good.