It was AFTER reading that section/page that I came up with my question! So it did NOT answer it, but it helped prompt the question. I'm not using CGI.pm, for various reasons (which the experts here will likely disagree with), because I could not see how it would handle my data in the way I needed. So I need some info on what methods might work when not using the standard routines, OK?
CGI.pm implements a $CGI::POST_MAX variable, which sets an upper limit on the size of request it will handle. Look at that code and reimplement it or copy it into yours.
It sounds more like what you're after is configuring your web server to disallow requests over a certain size, but whether and how that can be done will vary from httpd to httpd (for example, see LimitRequestBody for Apache).
I looked at the POST_MAX code. It simply checks $ENV{CONTENT_LENGTH} against the POST_MAX value and returns a 413 error if it is larger. The problem with that solution is that the CGI script has ALREADY been handed the unneeded large data block, so handling it at the server level seems to be a better approach.
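For anyone following along without CGI.pm, here is a minimal sketch of that same POST_MAX-style guard done by hand. The limit, the helper name `post_size_ok`, and the message text are my own illustrative choices, not anything from CGI.pm itself; the check runs before any of the body is read from STDIN.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Arbitrary example limit in bytes; pick whatever suits your application.
my $POST_MAX = 1_048_576;    # 1 MiB

# Returns true if the declared request size is within the limit.
# A missing CONTENT_LENGTH is treated as zero.
sub post_size_ok {
    my ($content_length, $max) = @_;
    return ( $content_length || 0 ) <= $max;
}

unless ( post_size_ok( $ENV{CONTENT_LENGTH}, $POST_MAX ) ) {
    # Refuse before reading any of the body from STDIN.
    print "Status: 413 Request Entity Too Large\r\n";
    print "Content-Type: text/plain\r\n\r\n";
    print "POST body exceeds the $POST_MAX byte limit.\n";
    exit 0;
}
# ...otherwise read at most $ENV{CONTENT_LENGTH} bytes from STDIN as usual...
```

Note that this only trusts the client-supplied CONTENT_LENGTH header; as discussed below, the server has still accepted the bytes over the wire, which is why a server-level limit is the stronger fix.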
Based on your helpful pointer, I was able to get the host I use to add this directive to the allowed list. I have now added it at Directory level for my CGI folder and it works great to block undesired POST data over my specified size. THANKS much!
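For reference, the configuration described above looks roughly like this (assuming Apache 2.x; the directory path and the 1 MiB limit are placeholder examples, not the poster's actual values):

```apache
# Reject request bodies larger than 1 MiB before they reach the CGI script.
<Directory "/var/www/cgi-bin">
    LimitRequestBody 1048576
</Directory>
```

With this in place, Apache itself answers an oversized POST with a 413 response, so the CGI script never sees the body at all.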