sadarax has asked for the wisdom of the Perl Monks concerning the following question:
The Design: There is the main server (with the website), and volunteers offer their own computers and bandwidth to act as mirrors of the main website. End users ask the server for a file; the server tells the volunteer computers to fulfill the request. The user receives their file and never knows the difference. Note: the volunteer computers would not have the same URL or IP addresses as the main website.
Diagram of the general process
Is it possible to build a system like this? In particular, can it be done so that end users never notice the difference, while the server successfully shares the file-upload load with the volunteers?
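One simple way to approximate this, sketched below under assumptions not in the original post: the main server keeps a list of volunteer mirrors and answers each file request with an HTTP 302 redirect to a mirror. The mirror hostnames and the `pick_mirror` helper are hypothetical, and note one caveat against the stated goal: with a plain redirect the browser's address bar will show the mirror's URL, so the user *can* notice the difference unless the server relays the bytes itself (see the relay sketch further down in this thread of ideas).

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical list of volunteer mirrors (placeholder hostnames).
my @mirrors = (
    'http://mirror-one.example.net',
    'http://mirror-two.example.net',
);

# Pick a mirror for a given path. A real system would track which
# mirrors actually hold the file and are currently reachable; this
# sketch just distributes requests deterministically by a simple
# character sum of the path.
sub pick_mirror {
    my ($path, @pool) = @_;
    my $sum = 0;
    $sum += ord($_) for split //, $path;
    return $pool[ $sum % @pool ];
}

# In a CGI context the server would then emit a 302 redirect header
# pointing the browser at the chosen mirror.
sub redirect_for {
    my ($path) = @_;
    my $target = pick_mirror($path, @mirrors) . $path;
    return "Status: 302 Found\r\nLocation: $target\r\n\r\n";
}

print redirect_for('/files/photo.jpg');
```

The hash-of-path selection has a useful side effect: the same file always maps to the same mirror, so mirrors only need to cache the subset of files routed to them.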
If it is possible, I would like some resources and information about getting started with writing this system.
Using BitTorrent is not an option in this situation, both in principle and in practice; if you want to know why, read below.
Of course, any other information you good Monks feel would be useful for me to know about servers would also be appreciated.
Some other programmers suggested a system like this might be the most feasible: when the user goes to www.website.com, the server serves an 'empty' page with a script running on it that pulls the data from the volunteer computers and renders those files on the page for the user as normal. These would not be dynamically generated pages; the page would just accept the files sent from the volunteer computers (similar to hotlinking).
Diagram of the frame-script general process
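A server-side variant of this idea, sketched here as an assumption rather than anything from the original post, is to have the main server relay the file: it fetches the bytes from a volunteer mirror and streams them straight through to the user, so the user only ever talks to www.website.com. This uses `LWP::UserAgent` from CPAN. The trade-off to be aware of: the main server still spends download *and* upload bandwidth on every relayed request, so this hides the mirrors but does not by itself reduce the server's upload load.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

# Relay one file from a volunteer mirror to the end user.
# $mirror_base is a hypothetical mirror URL; $out is the filehandle
# connected to the user (e.g. STDOUT under CGI).
sub relay {
    my ($mirror_base, $path, $out) = @_;
    my $ua  = LWP::UserAgent->new( timeout => 10 );
    my $res = $ua->get( $mirror_base . $path );
    if ( $res->is_success ) {
        # Pass the mirror's content type through, then the raw bytes.
        print {$out} "Content-Type: ", $res->content_type, "\r\n\r\n";
        print {$out} $res->content;
        return 1;
    }
    return 0;    # mirror down; caller should fall back to a local copy
}

# Example usage under CGI (hypothetical mirror host):
# relay('http://mirror-one.example.net', '/files/photo.jpg', \*STDOUT)
#     or serve_locally('/files/photo.jpg');
```

Returning false when the mirror fails is important: volunteer machines come and go, so the main server needs a fallback path (serve the file itself, or try the next mirror) for the scheme to stay invisible to users.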
My idea is very similar in principle to BitTorrent, but BitTorrent itself is not at all suitable for use with a normal website.
Reasons why BitTorrent is not a practical idea:
1) A torrent usually needs to be completely downloaded, or at least a large chunk of it completed, before any of the data is readable. This would make browsing the website a long and arduous task and make casual browsing completely impossible. Worse, completing the torrent would require several times more bandwidth: the user would be given many files they might not be interested in, at significant cost to the website in bandwidth, ultimately making the upload-bandwidth problem much worse.
2) Torrent files are static; the only way to update one is to create another torrent file. That would not do for a website that changes a little each day, since it would require the user to download and run a new torrent file every time.
3) To use a torrent, the user must use a torrent program, not a web browser. That alone means more work for them just to view the site casually, which is not good for popularity. My system would be integrated relatively seamlessly on the server side, and end users would not need to behave any differently.
Replies are listed 'Best First'.

- Re: Bandwidth upload sharing, by Corion (Patriarch) on Oct 13, 2007 at 14:00 UTC
- Re: Bandwidth upload sharing, by clinton (Priest) on Oct 13, 2007 at 15:24 UTC
- Re: Bandwidth upload sharing, by NetWallah (Canon) on Oct 13, 2007 at 05:24 UTC
  - Reply by sadarax (Sexton) on Oct 13, 2007 at 09:36 UTC
- Re: Bandwidth upload sharing, by NetWallah (Canon) on Oct 14, 2007 at 04:53 UTC
- Re: Bandwidth upload sharing, by sadarax (Sexton) on Oct 14, 2007 at 10:30 UTC
  - Reply by NetWallah (Canon) on Oct 15, 2007 at 04:03 UTC