adamsj has asked for the wisdom of the Perl Monks concerning the following question:
use strict;
use LWP;

my $url     = 'ftp://path/to/a/greatbigfile';
my $agent   = LWP::UserAgent->new;
my $request = HTTP::Request->new(GET => $url);
$agent->proxy('ftp' => 'http://our.proxy.server:8080');

my $response = $agent->request($request);
$response->is_success or die "$url: ", $response->message, "\n";

# The entire body is held in memory by $response->content.
open(MYOUT, ">/home/greatbigfile") or die "No open? $!";
print MYOUT $response->content;
close(MYOUT);
Nothing wrong with the code per se--it works fine on itty bitty files--but it fails like this:
ftp://path/to/a/greatbigfile: Out of memory during "large" request for 33558528 bytes, total sbrk() is 37230796 bytes
on a file of nearly 40 meg. Is there any way to pull the file in chunks? I don't find it in the documentation.
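One approach, sketched below from the LWP::UserAgent documentation: request() accepts an optional second argument, either a filename or a code reference, and in either case LWP writes the body out in chunks as it arrives instead of accumulating it in $response->content. The URL, proxy, and output path are just the placeholders from the question above.

use strict;
use LWP::UserAgent;
use HTTP::Request;

my $url   = 'ftp://path/to/a/greatbigfile';
my $agent = LWP::UserAgent->new;
$agent->proxy('ftp' => 'http://our.proxy.server:8080');

my $request = HTTP::Request->new(GET => $url);

# Hand request() a callback; LWP invokes it once per chunk read from
# the connection, so the whole body never has to fit in memory.
open(MYOUT, ">/home/greatbigfile") or die "No open? $!";
my $response = $agent->request($request, sub {
    my ($chunk, $res, $protocol) = @_;
    print MYOUT $chunk;
});
close(MYOUT);

$response->is_success or die "$url: ", $response->message, "\n";

The shorter form, $agent->request($request, '/home/greatbigfile'), writes the body straight to that file with the same effect. With either form, $response->content ends up empty, since the data has already gone to the callback or the file.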
They laughed at Joan of Arc, but she went right ahead and built it. --Gracie Allen
Replies are listed 'Best First'.
Re: LWP file transfer running out of memory on a whopping big file
by gav^ (Curate) on Feb 07, 2002 at 13:56 UTC
by adamsj (Hermit) on Feb 07, 2002 at 14:17 UTC
by belden (Friar) on Feb 07, 2002 at 22:17 UTC