in reply to Having Web Server Download a File

Net::FTP, or Unix's wget command, or lynx, or wwwoffle, or...
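If the file sits on an FTP server, Net::FTP keeps everything inside Perl. A minimal sketch, assuming an anonymous server; the host, path, and filenames here are placeholders:

use Net::FTP;

my $ftp = Net::FTP->new('ftp.example.com')
    or die "Cannot connect: $@";
$ftp->login('anonymous', 'me@example.com')
    or die "Cannot login: " . $ftp->message;
$ftp->binary;                                  # binary mode, so nothing gets mangled
$ftp->get('pub/somefile.tar.gz', 'somefile.tar.gz')
    or die "Download failed: " . $ftp->message;
$ftp->quit;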

How about just using lynx -source? Short and sweet. Or wget; I've used it, called daily from a Perl script, to download a file and the pictures in it.
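If you just want the page contents in a Perl variable, backticks around lynx -source will do it. A small sketch, with the URL and output filename as placeholders:

my $url  = 'http://www.example.com/page.html';
my $html = `lynx -source $url`;        # capture the raw page
die "lynx failed\n" if $?;             # nonzero exit status from the child

open my $fh, '>', 'page.html' or die "Cannot write: $!";
print $fh $html;
close $fh;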

Or use a program that can resume downloads; I think Xdownload and some others can do that, and wget's -c flag also resumes partial downloads. This assumes you are on Unix. Or use what was said above. Anyway, here is a script of mine called dailywget which I've been using for a couple of years.

#!/usr/local/bin/perl
# Pick up files from the Internet every day and save them
# somewhere with a date stamp.

@urlstoget = qw( http://your.urls.here );
$rootdir   = "/annex/work/dailywget/dls/";

foreach $u (@urlstoget) {
    $t = scalar(localtime);              # e.g. "Tue Apr  4 02:59:02 2000"
    $date = substr($t, 20, 4) . "." . substr($t, 4, 3) . "." . substr($t, 8, 2);
    $date =~ s/\s//g;                    # strip the padding in single-digit days
    $filename = "solution_" . $date . ".html";
    &geturl($u, $filename);
    print "Retrieved $filename.\n";
}
chdir $rootdir;
exit 0;

sub geturl {
    my ($url, $f) = @_;
    chdir $rootdir;
    $cmd = "wget -t 5 -O - $url > $f";   # 5 tries, page to stdout, redirected into $f
    print "Executing $cmd\n";
    @args = ($cmd);
    #foreach $a (@args) { print "-> $a\n"; }
    system(@args);                       # single string, so the shell handles the redirection
    # you can put several urls on one line
}
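As an aside, slicing the output of localtime with substr is fragile (the offsets break if the format ever shifts). A sketch of the same date stamp built with POSIX::strftime, which ships with Perl, assuming you want the identical "2000.Apr.4" form:

use POSIX qw(strftime);

# %e is the space-padded day of month (not available on every platform;
# %d gives a zero-padded day instead). The substitution strips the padding.
my $date = strftime("%Y.%b.%e", localtime);
$date =~ s/\s//g;    # e.g. "2000.Apr.4"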

Re: Re: Having Web Server Download a File
by Juerd (Abbot) on May 16, 2002 at 08:23 UTC

    Forking (which is what shelling out to wget or lynx does) is expensive. Avoid it if you're on a loaded machine, or expect the machine to become loaded (always expect your site to grow large, at least in code design).
    Mirroring a single file is easy:

    use LWP::Simple;
    mirror($url, $filename);
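    For what it's worth, mirror() sends an If-Modified-Since header and skips the download when the local copy is already current; it returns the HTTP status code. A sketch of a checked call, with $url and $filename as above:

    use LWP::Simple;    # also exports the HTTP::Status constants

    my $rc = mirror($url, $filename);
    die "mirror failed with status $rc"
        unless is_success($rc) or $rc == RC_NOT_MODIFIED;    # 304 = local copy unchanged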
    Undoubtedly, there are recursive solutions on CPAN.
