I always use Wget for downloading big files, mostly because if a connection fails or times out, it will automatically retry and continue where it left off. It can also pick up and resume partially downloaded files. It's very reliable, and I've yet to see it fail to get a big file. You can easily use it from a Perl script with system or exec. It can also store files in a directory named after the server, saving you the hassle of specifying a file location.
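For example, here's a quick sketch of how you might call it from Perl with system (the URL is just a placeholder; -c resumes a partial download and --tries=0 keeps retrying until it succeeds):

    use strict;
    use warnings;

    # Placeholder URL -- swap in whatever you're actually fetching.
    my $url = 'https://example.com/big-file.iso';

    # -c continues a partially downloaded file, --tries=0 retries forever.
    my @cmd = ('wget', '-c', '--tries=0', $url);

    system(@cmd) == 0
        or die "wget exited with status ", $? >> 8, "\n";

Passing the command as a list to system skips the shell, so you don't have to worry about quoting the URL.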
One caveat: Wget must be built with OpenSSL support to download from a secure (HTTPS) server, but such builds are standard nowadays.