Spartacus has asked for the wisdom of the Perl Monks concerning the following question:

I have a script that uses Net::SSLeay to connect to an Apache webserver over SSL and download files from the site, following links in the pages. The issue is how Net::SSLeay fetches pages/files. As far as I can tell, the only way to do this is:
    ($page, $result, %headers) = Net::SSLeay::get_https("aaa.bbb.com", 443, "/data/file.gz",
        Net::SSLeay::make_headers('Authorization' =>
            'Basic ' . MIME::Base64::encode("$user:$pass")));

    # write the downloaded page into a local file
    open(FILE, '>', "file.gz") or die "open: $!";
    print FILE $page;
    close FILE;
The problem is downloading large files (hundreds of megs): I get a very friendly Out of memory! error. Is there another way to use Net::SSLeay that downloads directly to a file, redirects the response into a file handle of some sort, or uses unix piping to get around this?

To stave off the inevitable questions of why I'm using SSL: the system must be very secure, and the security folk here did not consider it a good idea to give unix accounts (for SCP or the like) to a large number of users.

Re: Memory limits with NET::SSLeay
by chipmunk (Parson) on May 17, 2001 at 23:03 UTC
    I took a look at the docs for Net::SSLeay... You're using the high-level API above; unfortunately, I don't see a way to download directly to a file with those methods. However, the module also has a low-level API, which should provide the functionality you need.

    You should be able to use the Net::SSLeay::read() method to read in chunks of the file and write them directly to the output file, rather than getting the entire file at once. The documentation provides more details.
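    For what it's worth, here is a rough sketch of what that might look like with the low-level calls described in the Net::SSLeay docs. The hostname, path, credentials, and output filename are placeholders carried over from the question; the header-stripping regex is a simplification, there is no certificate verification, and the exact return values of read() at EOF may vary by module version, so treat this as a starting point rather than a drop-in solution.

        use Socket;
        use MIME::Base64;
        use Net::SSLeay qw(die_now die_if_ssl_error);

        Net::SSLeay::load_error_strings();
        Net::SSLeay::SSLeay_add_ssl_algorithms();
        Net::SSLeay::randomize();

        my ($host, $port, $path) = ('aaa.bbb.com', 443, '/data/file.gz');
        my ($user, $pass)        = ('someuser', 'somepass');       # placeholders
        my $auth = MIME::Base64::encode("$user:$pass", '');        # no trailing newline

        # Open an ordinary TCP connection to the server.
        my $ip = gethostbyname($host) or die "gethostbyname: $!";
        socket(SOCK, PF_INET, SOCK_STREAM, 0) or die "socket: $!";
        connect(SOCK, sockaddr_in($port, $ip)) or die "connect: $!";
        select((select(SOCK), $| = 1)[0]);   # unbuffer the socket

        # Wrap the socket in an SSL session.
        my $ctx = Net::SSLeay::CTX_new() or die_now("CTX_new failed: $!");
        my $ssl = Net::SSLeay::new($ctx) or die_now("SSL new failed: $!");
        Net::SSLeay::set_fd($ssl, fileno(SOCK));
        Net::SSLeay::connect($ssl) and die_if_ssl_error('SSL connect failed');

        # Send a plain HTTP request with Basic auth over the SSL channel.
        Net::SSLeay::write($ssl, "GET $path HTTP/1.0\r\n"
                               . "Host: $host\r\n"
                               . "Authorization: Basic $auth\r\n\r\n");
        die_if_ssl_error('SSL write failed');

        # Read the response a chunk at a time and spool the body to disk,
        # so the whole file never has to fit in memory.
        open(OUT, '>', 'file.gz') or die "open: $!";
        binmode OUT;
        my ($in_body, $buf) = (0, '');
        while (defined(my $chunk = Net::SSLeay::read($ssl))) {
            last if $chunk eq '';                     # EOF
            if (!$in_body) {
                $buf .= $chunk;
                if ($buf =~ s/^.*?\r?\n\r?\n//s) {    # strip the HTTP headers
                    $in_body = 1;
                    print OUT $buf;
                    $buf = '';
                }
            } else {
                print OUT $chunk;
            }
        }
        close OUT;

        Net::SSLeay::free($ssl);
        Net::SSLeay::CTX_free($ctx);
        close SOCK;

    Since each chunk is written out as soon as it arrives, memory use stays bounded by the chunk size rather than the file size, which should avoid the Out of memory! error on multi-hundred-megabyte downloads.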