You're looking for a Perl solution to a non-Perl problem. Look at BitTorrent; it solves exactly this problem.
------
We are the carpenters and bricklayers of the Information Age.
Then there are Damian modules.... *sigh* ... that's not about being less-lazy -- that's about being on some really good drugs -- you know, there is no spoon. - flyingmoose
I shouldn't have to say this, but any code, unless otherwise stated, is untested
| [reply] |
Good answer. For a local network, I think I'd use NFS or something similar. That should mostly keep you from having to think about the network at all. Locking remains a weak point of NFS, though.
| [reply] |
If you are just talking about pushing a bunch of files from one server to many clients, take a look at rsync. It will send the files to the remote servers, then, as files change on the master server, send only the changed files (better yet, only the portions of each file that differ). This saves a lot of bandwidth.
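A minimal sketch of that push, using two local directories to stand in for master and client (paths are made up; in practice the destination would be something like `user@client:/data/`):

```shell
# Hypothetical master and mirror trees (temp dirs for illustration).
src=$(mktemp -d); dst=$(mktemp -d)
echo "v1" > "$src/app.conf"

# Initial full copy: -a preserves permissions, times, and symlinks.
rsync -a "$src/" "$dst/"

# A file changes on the master...
echo "v2" > "$src/app.conf"

# ...and a second run brings the mirror up to date. Over a network
# transport, rsync's delta algorithm sends only the differing blocks;
# add -z to compress the data on the wire.
rsync -a "$src/" "$dst/"

cat "$dst/app.conf"
```

The trailing slash on `"$src/"` matters: it syncs the directory's contents rather than creating a nested copy of the directory itself.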
| [reply] |
Doing scalable, practical P2P file transfer is very non-trivial, even from a purely technical point of view (not touching the ugly politics of it all). The Gnutella people screwed it up for years. Further, most of the scalable P2P apps out there have been re-implementing each others' ideas and calling them different things (look particularly at what Freenet calls "CHK", content-hash keys--similar ideas are used in almost every P2P app around, except the simple-stupid ones). So don't add to the mess. As hard as it is, many people have already created good solutions, and you'll do well to find them.
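The content-hash-key idea mentioned above can be sketched in a few lines: a file's key in the store is derived from its contents, so a copy fetched from any untrusted peer can be verified against the key itself. (The paths and store layout here are hypothetical; real systems like Freenet layer encryption and routing on top of this.)

```shell
# Hypothetical content-addressed store: files live under their own hash.
store=$(mktemp -d)
payload=$(mktemp)
echo "hello, network" > "$payload"

# The key is the SHA-256 of the contents, not a chosen filename.
key=$(sha256sum "$payload" | cut -d' ' -f1)
cp "$payload" "$store/$key"

# On retrieval (possibly from a stranger's machine), recompute the
# hash and compare it to the key; a mismatch means corrupt or bogus data.
check=$(sha256sum "$store/$key" | cut -d' ' -f1)
[ "$check" = "$key" ] && echo "key verified"
```

A side effect worth noting: identical files hash to identical keys, so the store deduplicates popular content for free, which is one reason the idea keeps being reinvented.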
----
send money to your kernel via the boot loader.. This and more wisdom available from Markov Hardburn.
| [reply] |