How about just using lynx -source? Short and sweet. Or wget; I've used it to download a page and the pictures in it, called from a Perl script daily.
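If all you need is the raw HTML, a minimal sketch of the lynx -source route from Perl (the URL is just a placeholder):

    #!/usr/local/bin/perl
    # grab the raw page source with lynx; backticks capture its stdout
    my $html = `lynx -source http://your.urls.here/page.html`;
    die "lynx failed: $?\n" if $?;
    print $html;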
Or use a program which can resume downloads; I think Xdownload or some others can do that, assuming you are on unix. Or do what was said above. Anyway, here is a script of mine called dailywget which I've been using for a couple of years; first, though, a quick sketch of resuming with wget's own -c flag:
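(The URL and filename are placeholders; -c tells wget to continue a partial file rather than start over.)

    # resume a partial download; five tries, same as the script below
    system("wget", "-c", "-t", "5", "http://your.urls.here/bigfile.tar.gz") == 0
        or warn "wget exited with $?\n";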
#!/usr/local/bin/perl
# pick up files from the Internet every day and save them somewhere
# with a date stamp.

@urlstoget = qw( http://your.urls.here );
$rootdir   = "/annex/work/dailywget/dls/";

foreach $u (@urlstoget) {
    $t    = scalar(localtime);  # e.g. "Tue Apr  4 02:59:02 2000"
    $date = substr($t, 20, 4) . "." . substr($t, 4, 3) . "." . substr($t, 8, 2);
    $date =~ s/\s//g;           # -> "2000.Apr.4"
    $filename = "solution_" . $date . ".html";
    &geturl($u, $filename);
    print "Retrieved $filename.\n";
}
chdir $rootdir;
exit 0;

sub geturl {
    my ($url, $f) = @_;
    chdir $rootdir;
    $cmd = "wget -t 5 -O - $url > $f";  # five tries, page to stdout, redirected into $f
    print "Executing $cmd\n";
    @args = ($cmd);
    #foreach $a (@args) { print "-> $a\n"; }  # debugging
    system(@args);  # wget will also accept several URLs on one command line
}
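One caveat on geturl above: handing system() a single string makes Perl go through the shell, so a URL with & or ? in it can break or do something unexpected. A sketch of a safer variant, using wget's -O flag to write the file directly instead of a shell redirect:

    # list form of system() skips the shell, so URL metacharacters are harmless;
    # -O $f saves the page straight into the target file
    system("wget", "-t", "5", "-O", $f, $url) == 0
        or warn "wget failed for $url: $?\n";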