in reply to Reducing LWP Buffering

Yeah, yeah, I've heard the answer before: use Net::FTP. But that's not what Anonymous really asked. With Perl there's almost always more than one way to do it, and I'm curious to see if it can be done this way--with LWP. Is this simply a limitation of the LWP module? Is there no way to tell LWP how to do buffering? FWIW, I once ran into a similar situation while working on an embedded platform where adding modules was very difficult. It's nice to be working on PC platforms again, where adding modules is easy, but I will never again tell someone who resists using a module, for whatever reason, to just use it anyway. I'll be watching this thread to see if the problem can be solved, methinks....

Replies are listed 'Best First'.
Re: Re: Reducing LWP Buffering
by Corion (Patriarch) on Dec 05, 2003 at 12:59 UTC

    I think that should be possible by simply giving the HTTP::Request a code reference as the value for the content key; that code reference then supplies the content. I don't know why this hasn't been mentioned yet, but I'm too lazy to check the documentation myself.

    Update: Quoth the LWP::UserAgent documentation:

    $ua->request( $request, $content_cb )

    ...

    The request methods described above--get(), head(), post() and mirror()--will all dispatch the request they build via this method. They are convenience methods that simply hide the creation of the request object for you.

    The $content_file, $content_cb and $read_size_hint all correspond to options described with the get() method above.

    You are allowed to use a CODE reference as content in the request object passed in. The content function should return the content when called. The content can be returned in chunks. The content function will be invoked repeatedly until it returns an empty string to signal that there is no more content.

    I know it was there somewhere.
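    A minimal sketch of that documented interface, self-contained so it can run without a server: the request's content is a CODE ref that hands back the body in chunks and returns '' when done. The URL is a placeholder, and the "file" is an in-memory string here purely to keep the example standalone.

    ```perl
    use strict;
    use warnings;
    use HTTP::Request;

    my $data   = "x" x 120_000;    # pretend this came from a file
    my $offset = 0;

    my $req = HTTP::Request->new( PUT => 'http://www.example.com/upload' );
    $req->content( sub {
        my $chunk = substr( $data, $offset, 51200 );
        $offset += length $chunk;
        return $chunk;             # '' once $offset reaches the end
    } );

    # LWP::UserAgent->request($req) would invoke the CODE ref repeatedly;
    # draining it by hand shows the chunk sizes:
    my @sizes;
    while ( length( my $chunk = $req->content->() ) ) {
        push @sizes, length $chunk;
    }
    print "@sizes\n";    # 51200 51200 17600
    ```

    With a real upload you would pass $req to $ua->request, and LWP would pull the body through the callback in 51200-byte pieces instead of holding it all in memory.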

    perl -MHTTP::Daemon -MHTTP::Response -MLWP::Simple -e ' ; # The $d = new HTTP::Daemon and fork and getprint $d->url and exit;#spider ($c = $d->accept())->get_request(); $c->send_response( new #in the HTTP::Response(200,$_,$_,qq(Just another Perl hacker\n))); ' # web
      I don't know why this wasn't mentioned yet

      Probably the same reason the original seeker didn't look at the docs.

      Having never tried that, I'm wondering what the code might look like. Would you need to do something like:

      open (READER, "$file") or die "Can't read: $!\n";
      my $req = HTTP::Request->new('PUT', "$url", undef, \&read_file_in_chunks(\*READER));

      sub read_file_in_chunks {
          my ($fh_ref) = @_;
          my $content;
          read ($fh_ref, $content, 51200);
          return $content;
      }

      Hmmm... actually, I just tried that, and it seems only the first 51200 bytes get uploaded. What did I miss?

        You can't pass parameters to a callback, so you have to either pass a closure or do some other magic. I haven't tried anything like this recently, so this is simply untested code off the top of my head:

        use strict;
        use LWP::UserAgent;
        use HTTP::Request;

        my $url      = 'http://www.example.com/';
        my $filename = 'test.file';

        open CONTENT, "<", $filename or die "Couldn't open $filename : $!";
        binmode CONTENT;

        my $callback = sub {
            my $content;
            my $size = read( CONTENT, $content, 51200 );
            $content = "" unless $size;
            $content;
        };

        # Per the documentation quoted above, the CODE ref goes in as the
        # content of the request object:
        my $ua  = LWP::UserAgent->new();
        my $req = HTTP::Request->new( PUT => $url );
        $req->content( $callback );
        $ua->request( $req );

        For anything fancier, I strongly suggest you learn about anonymous code references and closures.
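        As a sketch of that closure idea: a factory sub can close over a filehandle, so the returned CODE ref needs no arguments--exactly the shape LWP's content callback requires. The file name in the usage comment is hypothetical.

        ```perl
        use strict;
        use warnings;

        # Returns a zero-argument callback that reads $fh in 51200-byte
        # chunks; the closed-over filehandle carries the state between calls.
        sub make_reader {
            my ($fh) = @_;
            return sub {
                my $read = read( $fh, my $chunk, 51200 );
                return $read ? $chunk : '';    # '' ends the upload
            };
        }

        # Usage (assumed file name):
        # open my $fh, '<', 'test.file' or die "Can't open test.file: $!";
        # binmode $fh;
        # $request->content( make_reader($fh) );
        ```

        Each request gets its own reader with its own filehandle, which avoids the shared-bareword-filehandle problem of the example above.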

        Did you try the :content_file option?
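        For the download direction, :content_file tells LWP to write the response body straight to disk instead of buffering it in memory. A hedged, self-contained sketch, using a file:// URL so it runs without a network (an http:// URL works the same way; all paths here are temporary):

        ```perl
        use strict;
        use warnings;
        use LWP::UserAgent;
        use File::Temp qw(tempdir);

        my $dir = tempdir( CLEANUP => 1 );
        my $src = "$dir/source.txt";
        open my $fh, '>', $src or die "Can't write $src: $!";
        print $fh "hello, streaming world\n";
        close $fh;

        # The body of the response is saved directly to $dest.
        my $ua   = LWP::UserAgent->new;
        my $dest = "$dir/copy.txt";
        my $res  = $ua->get( "file://$src", ':content_file' => $dest );
        die $res->status_line unless $res->is_success;
        print -s $dest, " bytes saved to $dest\n";
        ```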
Re: Re: Reducing LWP Buffering
by Anonymous Monk on Dec 05, 2003 at 13:03 UTC
    but I'm curious to see if it can be done this way--with LWP
    Can't you see it in the documentation?