Thanks for the welcome and the suggestions on how to formulate my question better.
The problem I'm having seems to be generic, which is why my question was phrased that way. To get into more specifics, an example I'm having issues with would be a PUT request with a huge binary file as the payload. It's a request issued with something like curl, so it's just the PUT line, some headers, and then the file itself. The content type in this case is application/octet-stream, so CGI.pm dumps the body into the PUTDATA parameter. The problem is that there aren't always enough resources available for Perl to read all of that into memory. Instead of dealing with the whole chunk of data as a scalar, I would want CGI to save it to disk somewhere and just point me to the file.
Here's a super simple example of what I'm talking about:
my $incoming_request = CGI->new;
print $incoming_request->param("PUTDATA");
The above will cause Perl to run out of memory if the PUT data is large enough.
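One way around the slurp is to not let CGI.pm see the body at all: on a PUT, the payload arrives on STDIN with its size in $ENV{CONTENT_LENGTH}, so you can copy it to a temp file in fixed-size chunks before (or instead of) constructing the CGI object. A minimal sketch, assuming a hypothetical helper name save_body_to_tempfile (not part of CGI.pm):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Hypothetical helper: stream a request body from a filehandle to a
# temp file in fixed-size chunks, so memory use stays bounded no
# matter how large the upload is. Returns the temp file's path.
sub save_body_to_tempfile {
    my ($in_fh, $content_length) = @_;
    my ($out_fh, $filename) = tempfile(UNLINK => 0);
    binmode $in_fh;
    binmode $out_fh;

    my $remaining  = $content_length;
    my $chunk_size = 64 * 1024;    # read 64 KB at a time
    while ($remaining > 0) {
        my $to_read = $remaining < $chunk_size ? $remaining : $chunk_size;
        my $got = read($in_fh, my $buf, $to_read);
        die "read failed: $!" unless defined $got;
        last if $got == 0;         # premature end of stream
        print {$out_fh} $buf or die "write failed: $!";
        $remaining -= $got;
    }
    close $out_fh or die "close failed: $!";
    return $filename;              # caller gets a path, not a giant scalar
}

# In a CGI environment this would be called as something like:
#   my $path = save_body_to_tempfile( \*STDIN, $ENV{CONTENT_LENGTH} );
# Note this has to run before CGI->new, since CGI.pm reads the PUT
# body off STDIN into PUTDATA when the object is constructed.
```

This sidesteps PUTDATA entirely; the script then works with the returned path rather than holding the payload in memory.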
Thanks again for your help.
In reply to Re^2: Handling lots of PUT data in CGI
by rufusisnodufus
in thread Handling lots of PUT data in CGI
by rufusisnodufus