in reply to Re: Apache / 30,000 Perl script serving limit
in thread Apache / 30,000 Perl script serving limit

I have tried setting MaxRequestsPerChild to various values. The outcome is that it takes much longer for my script to make the calls, but the server again starts returning error 500 at around the 30,000th request.
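
For reference, this is the sort of directive I mean (a minimal sketch; the value shown is just an arbitrary example, and in Apache 2.4 the directive was renamed MaxConnectionsPerChild):

    # httpd.conf -- recycle each child process after it has served
    # this many requests; 0 means never recycle.
    MaxRequestsPerChild 1000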

Re^3: Apache / 30,000 Perl script serving limit
by ig (Vicar) on May 07, 2009 at 18:28 UTC

    So the constraint must be common to all of the processes, rather than applied to each process separately. And since you get the same result with various CGI scripts, it is unlikely that the CGI scripts themselves are consuming the resource. This implies that the server is consuming it (either the parent server process or the child processes).
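
    One way to test that theory is to watch system-wide resources while your test runs. A rough sketch using standard Unix tools (exact options vary between HP-UX and other systems, and lsof may not be installed by default):

        # Watch SysV IPC objects that Apache uses; a steadily growing
        # count as requests accumulate would point at a leak in the server.
        ipcs -s    # semaphores
        ipcs -m    # shared memory segments

        # If lsof is available, count open file descriptors held by an
        # httpd process (the pid here is illustrative).
        lsof -p <httpd-pid> | wc -l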

    It has been suggested that you run strace (I understand tusc is the equivalent on HP-UX) against the server to see what is happening when it fails. An alternative would be to attach a debugger to one of the failing server processes and step through to the failing system call. It would help to know for certain which system call is failing.
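
    For example (a sketch; I believe the tusc options are roughly equivalent to strace's, but check the man page on your system):

        # Linux: attach to a running child process, follow forked
        # children, and log all system calls to a file.
        strace -f -p <pid> -o /tmp/httpd.trace

        # HP-UX: tusc is the rough equivalent.
        tusc -f -o /tmp/httpd.trace <pid>

    Run this against a child shortly before the 30,000-request mark and look at the last few system calls in the trace when the error 500s begin.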