What if the server drops the connection halfway through on a slower link? You'll have to cater for that, for example by resuming the download from a given position, but not all servers support that.
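If you do want to handle resuming yourself, here is a minimal sketch using LWP::UserAgent (the URL and filename are made up for illustration). It asks for the bytes you're missing with a Range header, and only appends to the partial file if the server answers 206 Partial Content; otherwise it starts over from scratch.

#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

# Hypothetical URL and local filename, for illustration only.
my $url  = 'http://www.example.com/backup.tar.gz';
my $file = 'backup.tar.gz';

my $ua     = LWP::UserAgent->new;
my $offset = -e $file ? -s $file : 0;    # bytes we already have

my $fh;
my $res = $ua->get(
    $url,
    ($offset ? ('Range' => "bytes=$offset-") : ()),
    ':content_cb' => sub {
        my ($chunk, $response) = @_;
        unless ($fh) {
            # 206 means the server honoured the Range header, so append;
            # anything else means we got the whole file and must start over.
            my $mode = ($offset && $response->code == 206) ? '>>' : '>';
            open $fh, $mode, $file or die "Can't open $file: $!";
            binmode $fh;
        }
        print {$fh} $chunk;    # stream each chunk to disk
    },
);
close $fh if $fh;
die 'Download failed: ', $res->status_line, "\n" unless $res->is_success;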
That said, I'd say it's best to get yourself a download manager, like GetRight, NetAnts, etc., which are specialized in downloading large files efficiently from the Internet. They can do special speed optimizations too, like opening multiple connections and downloading sections of the file simultaneously.
What I'm trying to do is automate downloading the backup file created from my web site. You need to log in (auth is done via .htaccess), and then you can download the file over HTTP.
I wanted to write a Perl script that I could run once a day, via cron, to do this.
You're right that I could use GetRight etc., but those packages don't run from the command line and can't run without interaction, so they won't work as a cron job.
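Something along these lines is what I had in mind: a rough sketch using LWP::UserAgent, where the host, realm, username, password, and paths are all placeholders. Since .htaccess protection is ordinary HTTP Basic auth, LWP's credentials method should handle the login.

#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

# All of these values are placeholders.
my $url  = 'http://www.example.com/backups/site-backup.tar.gz';
my $file = '/home/chris/backups/site-backup.tar.gz';

my $ua = LWP::UserAgent->new;

# Supply the username and password for the host:port and the realm
# that the server announces in its 401 challenge.
$ua->credentials('www.example.com:80', 'Restricted Area', 'user', 'secret');

# :content_file streams the response straight to disk.
my $res = $ua->get($url, ':content_file' => $file);
die 'Download failed: ', $res->status_line, "\n" unless $res->is_success;

A crontab entry along the lines of

0 3 * * * /home/chris/bin/get_backup.pl

would then fetch the file unattended at 3 a.m. every night (the script path is made up too).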
Thanks
Chris