We have Apache 2 and Perl 5.8 running on HP-UX. After roughly 30,000 CGI requests, subsequent calls return 500 Internal Server Error (everything else on the web server is still functional, and from the shell the Perl scripts still run fine indefinitely). httpd.conf has the following, but changing these values does not affect the roughly 30,000-request limit before CGI becomes unavailable:

    <IfModule worker.c>
        ServerLimit          16
        StartServers          2
        MaxClients          150
        MinSpareThreads      25
        MaxSpareThreads      75
        ThreadsPerChild      25
        MaxRequestsPerChild   0
    </IfModule>

This is the test script we use to reproduce the problem:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use lib "/web_sites/cgi-modules";
    use LWP::Simple;
    use LWP::UserAgent;
    use HTTP::Request;
    use HTTP::Response;
    use HTML::LinkExtor;

    my $url     = "http://devintranet2/cgi-bin/environment.pl";
    my $counter = 0;

    # Request the CGI page up to 60,000 times, printing the status of each call.
    while ($counter < 60000) {
        my $browser = LWP::UserAgent->new();
        $browser->timeout(10);
        my $request  = HTTP::Request->new(GET => $url);
        my $response = $browser->request($request);

        # Stop on the first error response (this is where the 500s start).
        if ($response->is_error()) {
            #printf "%s\n", $response->status_line;
            $counter = $counter + 1;
            print "$counter [";
            printf "%s]\n", $response->status_line;
            die;
        }

        my $contents = $response->content();
        #print "$contents";
        #sleep(0.3);

        $counter = $counter + 1;
        print "$counter [";
        printf "%s]\n", $response->status_line;
    }
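For reference, here is a variant of the same loop (my own sketch, not part of the original setup) that reuses a single LWP::UserAgent with keep-alive enabled instead of creating a new one per request. If the failure point moves or disappears with this variant, that would suggest the limit is tied to per-request connection setup (for example, client-side ports stuck in TIME_WAIT) rather than to Apache itself. The URL is the same intranet endpoint as in the script above.

    #!/usr/bin/perl
    # Sketch only: same test, but with one shared user agent and keep-alive.
    use strict;
    use warnings;
    use LWP::UserAgent;

    my $url     = "http://devintranet2/cgi-bin/environment.pl";
    my $browser = LWP::UserAgent->new( keep_alive => 1 );   # reuse TCP connections
    $browser->timeout(10);

    for my $counter ( 1 .. 60000 ) {
        my $response = $browser->get($url);                  # returns an HTTP::Response
        printf "%d [%s]\n", $counter, $response->status_line;
        die "stopping at request $counter\n" if $response->is_error();
    }

Creating a new user agent for every request, as in the original script, opens a fresh TCP connection each time; reusing one agent keeps connections alive and helps isolate whether the ceiling is on the client or the server side.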
(I beg forgiveness if this question is more mod_perl / Apache related.) Thanks, everyone.