
You could make this transparent to the browser: a forward web proxy on the Mars side with a local 1 TB cache whose entries don't expire for months. If a request misses the cache, it is queued to be sent with the next upload transmission. An hour or so later, the requested URLs (along with all the related resources a simulated browser session required) appear in the cache.
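As a rough sketch of that hit-or-queue behaviour (names like `MarsProxy` and `ingest` are made up for illustration, not any real proxy software):

```python
import time

CACHE_TTL = 90 * 24 * 3600  # entries live for months, not minutes

class MarsProxy:
    """Sketch of a Mars-side forward proxy: serve from a long-lived local
    cache; queue misses for the next scheduled uplink window to Earth."""

    def __init__(self):
        self.cache = {}         # url -> (stored_at, body)
        self.uplink_queue = []  # URLs to request in the next transmission

    def get(self, url):
        entry = self.cache.get(url)
        if entry and time.time() - entry[0] < CACHE_TTL:
            return entry[1]               # cache hit: instant response
        self.uplink_queue.append(url)     # miss: queue for the next uplink
        return None                       # caller shows a "queued" page

    def ingest(self, url, body):
        # Called when a downlink batch arrives from the Earth-side crawler.
        self.cache[url] = (time.time(), body)
```

The browser never sees the queueing; it just gets an instant answer for anything already in the cache, and the same request succeeds hours later once the downlink batch arrives.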

You could also auto-crawl sites via their RSS feeds, transmitting the content many links deep. Each colonist could then subscribe to whatever sites interested them, including feeds from sites such as HN. All the crawling would be done by a server on Earth, which has a reliable (but delayed) channel to Mars.
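The Earth-side crawl could be a plain breadth-first walk of the subscribed feeds with a depth cutoff. A minimal sketch, where `fetch` and `extract_links` stand in for real HTTP and HTML-parsing code:

```python
from collections import deque

def crawl_bundle(seed_urls, fetch, extract_links, max_depth=2):
    """Earth-side sketch: breadth-first crawl of subscribed feed URLs,
    following links up to `max_depth` hops, producing a {url: body}
    bundle for the next scheduled transmission to Mars."""
    bundle = {}
    frontier = deque((url, 0) for url in seed_urls)
    while frontier:
        url, depth = frontier.popleft()
        if url in bundle:
            continue                      # already crawled this round
        body = fetch(url)
        bundle[url] = body
        if depth < max_depth:
            for link in extract_links(body):
                frontier.append((link, depth + 1))
    return bundle
```

The depth limit is what keeps "many links deep" bounded so the bundle fits the available bandwidth per transmission window.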

The transmission between Earth and Mars would not be TCP/IP; it would be a protocol designed to maximise data recoverability without re-sending packets. It could use techniques such as 2D Hamming codes (error-correcting checks within each packet and across many packets).
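The simplest cross-packet case of that idea is a single XOR parity packet, which lets the receiver rebuild any one lost packet without a round trip (the "2D" version would add per-packet codes on top of this). A toy sketch, assuming fixed-size packets:

```python
from functools import reduce

def xor_bytes(a, b):
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def add_parity(packets):
    """Append one XOR parity packet over a group of fixed-size packets."""
    assert len({len(p) for p in packets}) == 1, "fixed-size packets assumed"
    return packets + [reduce(xor_bytes, packets)]

def recover(coded, lost_index):
    """Rebuild a single erased packet by XORing all the survivors:
    since p0 ^ p1 ^ ... ^ parity == 0, the missing one is the XOR
    of everything else."""
    survivors = [p for i, p in enumerate(coded) if i != lost_index]
    return reduce(xor_bytes, survivors)
```

With a 5-to-40-minute round trip, recovering erasures locally like this beats any retransmission scheme by hours.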



That is exactly what I was thinking: some kind of channel that just blasts content to the Mars cache, driven by a list of specific requests, with the rest of the available bandwidth filled by a predictive algorithm and/or crawler. Unfortunately, I'm probably just gonna have to be happy if we even land someone on Mars in my lifetime.


This sounds too much like Stallman reading his email.




