RockyMtn has asked for the wisdom of the Perl Monks concerning the following question:
We found that a CGI script calling 'warn' too many times would hang the Perl process running under Apache (as a separate process). When we kill the process, the Apache error log shows the first 648 calls to warn, but no more. This does not happen from the command line, nor when running under the Eclipse CGI debugger (which doesn't use Apache).
The same thing happens if we merely print to STDERR 1000 times: it hangs on call 649 (judging by the loop counter echoed to the browser via STDOUT). At 100 bytes per call, 648 calls = 64,800 bytes, and 100 more bytes still shouldn't put us over 64*1024 = 65,536.
Any ideas please?
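For reference, a ~64 KB ceiling is exactly the size of a typical OS pipe buffer: if Apache stops draining the child's stderr pipe, the next write blocks once the kernel buffer is full. Below is a minimal sketch (not from the thread, and in Python rather than Perl purely for illustration) that measures how many 100-byte chunks fit into an unread pipe before the writer would block; the exact capacity is platform-specific, but on Linux it is commonly 65,536 bytes.

```python
import os

# Create a pipe and fill the write end without ever reading the other
# end, to show the fixed kernel buffer a blocked writer runs into.
r, w = os.pipe()
os.set_blocking(w, False)  # error out (BlockingIOError) instead of hanging

total = 0
try:
    while True:
        # 100-byte chunks, like the 100-byte warn lines in the CGI script
        total += os.write(w, b"x" * 100)
except BlockingIOError:
    pass  # pipe buffer is full; a blocking writer would hang right here

print(total)  # bytes buffered before the writer would have blocked
os.close(r)
os.close(w)
```

With blocking I/O (the default for a CGI script's stderr), that last write simply never returns until a reader empties the pipe, which matches the observed hang after ~648 successful 100-byte writes.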
env details:
You can try this with:
```perl
#!C:\perl\bin\perl.exe
use strict;

select( STDERR ); $| = 1;
select( STDOUT ); $| = 1;

print <<HTML_HEADER;
Content-type: text/plain

<HTML>
<head>
<title>testWarn.pl</title>
</head>
<body style='background-color:#cccccc'>
HTML_HEADER

# Write to stdout lots of times...
for ( my $lineNumber = 1; $lineNumber <= 1000; $lineNumber++ ) {
    print "$lineNumber...<br>\n";
    my $errMsg = substr( "Line $lineNumber....." x 10, 0, 99 ) . "\n";
    print STDERR $errMsg;  # called 649 times, writes 648x successfully; 648*100=64,800; 64*1024=65,536
    #warn( $errMsg );      # called 649 times, writes 648x successfully
}

print <<HTML_TRAILER;
</body>
</HTML>
HTML_TRAILER
```

Any help is appreciated.
Replies are listed 'Best First'.
Re: output to STDERR/warn hangs Perl under Apache at ~64,800 bytes
by Anonymous Monk on Jul 17, 2014 at 03:13 UTC
Re: output to STDERR/warn hangs Perl under Apache at ~64,800 bytes
by cord-bin (Friar) on Jul 17, 2014 at 13:32 UTC

by RockyMtn (Novice) on Jul 17, 2014 at 17:28 UTC