I'm looking for a good method for transferring large amounts of data from one Unix server to another.
I'm thinking of an approach where a "server" feeds groups of files to several "clients": each client transfers its file(s) (ftp, scp, whatever) and then receives another chunk of files to process, until all files have been transferred.
I've googled and searched through PM for clues on how best to implement this in Perl. It seems the best options are threads and POE.
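For what it's worth, here's a rough sketch of the sort of thing I have in mind using ithreads and Thread::Queue. The file list, worker count, remote host, and the scp call are all just placeholders for whatever the real setup would use:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use threads;
    use Thread::Queue;

    # Placeholder list of files to send; in practice this would be
    # built from the source directory tree.
    my @files = glob('/data/outgoing/*');

    my $queue = Thread::Queue->new();
    $queue->enqueue(@files);

    my $workers = 4;    # number of concurrent "clients"

    # One end-of-work marker per worker so each thread knows to stop.
    $queue->enqueue((undef) x $workers);

    my @threads = map {
        threads->create(sub {
            while (defined(my $file = $queue->dequeue())) {
                # scp is just a stand-in -- could be ftp, rsync, etc.
                system('scp', $file, 'user@remotehost:/data/incoming/') == 0
                    or warn "transfer of $file failed: $?";
            }
        });
    } 1 .. $workers;

    $_->join() for @threads;

The queue effectively plays the "server" role here and each thread is a "client", but I don't know whether this scales better or worse than an event-driven POE setup.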
Anyone blazed this trail already? Anyone have any implementation recommendations? Anyone have any file transfer protocol preferences?