Hello,

I'm wondering if anyone has ideas on how best to implement a Perl-based download manager. It has to be able to download files that are gigabytes in size (in parallel, ideally), resume if any connection is broken, and be as fast as possible — something like the CNET download manager by Kontiki. FTP with multiple connections might work, but I don't think it supports resuming broken downloads. I believe the way to go would be to work with the packets themselves, but I have no idea where to start with that. Please let me know if anyone has links or book titles on working with packets in the context of a download manager, or if there is a better way to build one.
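For context, here is roughly what I have in mind for the resume part — just a minimal sketch, assuming an HTTP server that honours Range requests and that LWP::UserAgent is installed (the URL and filename below are placeholders, not a real file):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

my $url  = 'http://example.com/big.iso';   # placeholder URL
my $file = 'big.iso';                      # placeholder local file

my $ua   = LWP::UserAgent->new;
my $have = -e $file ? -s $file : 0;        # bytes already on disk

open my $fh, '>>', $file or die "open: $!";
binmode $fh;

# Ask the server for the rest of the file. A "206 Partial Content"
# reply means the resume was honoured; a plain 200 means the server
# ignored the Range header and restarted from the beginning.
my $res = $ua->get(
    $url,
    'Range'       => "bytes=$have-",
    ':content_cb' => sub { print {$fh} $_[0] },  # stream chunks to disk
);
close $fh;
die 'download failed: ', $res->status_line unless $res->is_success;
```

Is something along these lines the right direction, and could the same idea be run in several processes at once (each fetching a different byte range) to get the parallel part?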
Thanks
Sin Tron