in reply to Automatic Download of Multiple Files with Perl

Personally, I'd just do wget -r -l 1 --wait=$SOME_WAIT_TIME http://archive.godatabase.org/latest-full/

I'm pretty sure you could build something similar fairly easily using WWW::Mechanize.
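A rough sketch of what that could look like (untested; the wait time and the "stay below the starting directory" check are my own assumptions, adjust to taste):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use WWW::Mechanize;

    my $base = 'http://archive.godatabase.org/latest-full/';
    my $wait = 5;    # assumed polite pause between requests, in seconds

    my $mech = WWW::Mechanize->new( autocheck => 1 );
    $mech->get($base);

    # Walk every link on the index page and mirror the ones that live
    # below the starting directory.
    for my $link ( $mech->links ) {
        my $url = $link->url_abs->as_string;
        next unless $url =~ m{^\Q$base\E.};    # skip the parent-directory link
        next if $url =~ m{\?};                 # skip the index's sort links
        my ($file) = $url =~ m{([^/]+)$};
        next unless defined $file;             # skip subdirectory links
        print "fetching $file\n";
        $mech->mirror( $url, $file );          # saves to disk, skips if unchanged
        sleep $wait;
    }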

Replies are listed 'Best First'.
Re^2: Automatic Download of Multiple Files with Perl
by varian (Chaplain) on Apr 16, 2007 at 15:41 UTC
    I second Joost's suggestion to use or embed wget to get an entire directory in one shot.

    Even though I like and use LWP a lot for web site interaction, from personal experience I have seen LWP hang on downloads of larger files (>200MB) over less reliable networks, whereas wget has never let me down in that context.
    Its ability to resume a download after a network failure (even from the byte count where the connection was lost) is a nice feature, and it can run unattended.
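    For reference, the flag that does the resuming is -c/--continue, and --tries controls how many times wget retries after a failure, so a typical invocation (the URL is just a placeholder) looks something like:

        wget -c --tries=20 http://example.com/some-large-file.tar.gz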

Re^2: Automatic Download of Multiple Files with Perl
by smahesh (Pilgrim) on Apr 17, 2007 at 03:43 UTC

    Joost++.

    It will be less of a hassle to use an external program optimized for downloading, like wget or curl, instead of trying to use LWP. LWP is a general-purpose library for interacting with websites, and it sometimes croaks when downloading large files. Also, LWP has no support for downloading entire directories or resuming aborted downloads; all that logic has to be implemented in your code. With wget/curl you get all of this "built-in", and your Perl code is a simple wrapper.
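    As an illustration, the wrapper really can be that thin. A minimal sketch (untested; the wait time is an assumption, and you would want your own error handling):

        #!/usr/bin/perl
        use strict;
        use warnings;

        my $url  = 'http://archive.godatabase.org/latest-full/';
        my @args = (
            '-r',          # recurse into the directory listing
            '-l', '1',     # ...but only one level deep
            '-np',         # never ascend to the parent directory
            '-c',          # resume partial downloads after a failure
            '--wait=5',    # assumed polite pause between requests
            $url,
        );

        system( 'wget', @args ) == 0
            or die "wget exited with status $?\n";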

    Offtopic: in college, we had a thin-pipe connection and could not download research papers without timeouts, etc. I used to create a file with the list of URLs to fetch, run wget -i <filename> in a detached screen session, and it would download everything overnight. I think you need to do something similar here.
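    If memory serves, that boils down to something like this (the session and file names are made up):

        screen -dmS fetch wget -c -i urls.txt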

    Mahesh