Joost++.
It will be less of a hassle to use an external program optimized for downloading, such as wget or curl, than to try to do it all with LWP. LWP is a general-purpose library for interacting with websites, and it sometimes croaks when downloading large files. LWP also has no built-in support for downloading entire directories or resuming aborted downloads; all of that logic has to be implemented in your own code. With wget/curl you get these features "built-in", and your Perl code is a simple wrapper. A sketch of such a wrapper follows below.
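Here is a minimal sketch of that wrapper idea, assuming the URLs and the download directory are placeholders you would replace with your own; wget's -c (resume) and --tries options do the heavy lifting:

#!/usr/bin/perl
use strict;
use warnings;

# Example URL list -- replace with the files you actually need.
my @urls = (
    'http://example.com/paper1.pdf',
    'http://example.com/paper2.pdf',
);

for my $url (@urls) {
    # -c resumes a partially downloaded file, --tries retries on failure,
    # -P collects everything in one directory.
    my $status = system( 'wget', '-c', '--tries=5', '-P', 'downloads', $url );
    warn "wget failed for $url\n" if $status != 0;
}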
Offtopic: In college, we had a thin-pipe connection and could not download research papers without timeouts, etc. I used to create a file with the list of URLs to fetch, run wget -i <filename> in a detached screen session, and it would download everything overnight. I think you need to do something similar here (see the sketch below).
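A rough sketch of that overnight-batch approach, assuming hypothetical file and session names (urls.txt, "fetch"); wget reads the URL list with -i, and screen -dmS keeps it running after you log out:

#!/usr/bin/perl
use strict;
use warnings;

my @urls = (
    'http://example.com/paper1.pdf',
    'http://example.com/paper2.pdf',
);

# Write the URL list that wget will read with -i.
open my $fh, '>', 'urls.txt' or die "Cannot write urls.txt: $!";
print {$fh} "$_\n" for @urls;
close $fh;

# Launch wget in a detached screen session; -c resumes interrupted downloads.
system( 'screen', '-dmS', 'fetch', 'wget', '-c', '-i', 'urls.txt' );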
Mahesh