in reply to Re: Handling lots of PUT data in CGI
in thread Handling lots of PUT data in CGI

Thanks for the welcome and the suggestions on how to formulate my question better.

The problem I'm having seems to be generic, which is why my question was phrased that way. To get into specifics, one case I'm having trouble with is a PUT request with a huge binary file as the payload. The request is issued with something like curl, so it's just the PUT line, some headers, and then the file itself. The content type in this case is application/octet-stream, so CGI.pm dumps the data into the PUTDATA parameter. The problem is that there isn't always enough memory available for Perl to hold all of that at once. Instead of dealing with that whole chunk of data as a scalar, I'd like CGI to save it to disk somewhere and just point me to the file.

Here's a super simple example of what I'm talking about:

use CGI;

my $incoming_request = CGI->new;
print $incoming_request->param("PUTDATA");

The above will cause Perl to run out of memory if the PUT data is large enough.
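
To make the behavior I'm after concrete, here's a rough sketch of what I'd want CGI to do for me under the hood: read the body off STDIN in fixed-size chunks and spool it into a temp file, never holding more than one chunk in memory. The 64 KB chunk size is arbitrary, and it relies on the web server setting CONTENT_LENGTH as usual for CGI; File::Temp does the temp-file bookkeeping:

#!/usr/bin/perl
use strict;
use warnings;
use File::Temp ();

# Keep the spooled file around after the script exits so its path
# can be handed to whatever processes the upload next.
my $tmp = File::Temp->new( UNLINK => 0 );
binmode STDIN;
binmode $tmp;

my $remaining = $ENV{CONTENT_LENGTH} || 0;
while ( $remaining > 0 ) {
    my $want = $remaining < 65536 ? $remaining : 65536;
    my $got  = read( STDIN, my $buffer, $want );
    last unless $got;    # EOF or read error; stop rather than spin
    print {$tmp} $buffer;
    $remaining -= $got;
}
close $tmp or die "Couldn't close temp file: $!";

print "Content-Type: text/plain\r\n\r\n";
print "PUT body spooled to ", $tmp->filename, "\n";

I know I can do something like that by hand; my question is really whether CGI.pm (or something else) can do it for me.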

Thanks again for your help.

Re^3: Handling lots of PUT data in CGI
by blue_cowdawg (Monsignor) on May 29, 2013 at 12:57 UTC
        The problem is that there isn't always enough memory available for Perl to hold all of that at once. Instead of dealing with that whole chunk of data as a scalar, I'd like CGI to save it to disk somewhere and just point me to the file.

    To me that sounds like a different problem, not a Perl issue. It almost sounds systemic. If, for example, you are running this code on a system with 4 GB of RAM (pretty common these days) and you are running out of memory, then you need to constrain the data to smaller chunks.
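
    For example, one way to constrain it is to refuse oversized requests before reading anything. Here's a minimal sketch, assuming a standard CGI environment where the web server sets CONTENT_LENGTH; the 10 MB ceiling is purely illustrative:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;

    # Illustrative ceiling; size it to what the host can actually spare.
    my $MAX_BODY = 10 * 1024 * 1024;    # 10 MB

    my $declared = $ENV{CONTENT_LENGTH} || 0;
    if ( $declared > $MAX_BODY ) {
        # Refuse up front, before any of the body is read into memory.
        print "Status: 413 Request Entity Too Large\r\n";
        print "Content-Type: text/plain\r\n\r\n";
        print "Declared body of $declared bytes exceeds the $MAX_BODY byte limit.\n";
        exit;
    }

    # Small enough to buffer safely, so let CGI.pm slurp it as usual.
    my $q    = CGI->new;
    my $data = $q->param('PUTDATA');
    print $q->header('text/plain');
    printf "Received %d bytes\n", defined $data ? length $data : 0;

    Past that limit, anything bigger has to be streamed to disk in pieces rather than slurped, which, as you've seen, CGI.pm's PUTDATA handling won't do for you.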


    Peter L. Berghold -- Unix Professional
    Peter -at- Berghold -dot- Net; AOL IM redcowdawg Yahoo IM: blue_cowdawg