bivansc has asked for the wisdom of the Perl Monks concerning the following question:

I have a custom templating system I wrote to do language translations: it uses mod_perl to filter every page, with XML files supplying the translated words. It works well, and I even figured out how to make it handle server-side includes such as embedded PHP.

However, it appears that any time my mod_perl code, running as an Apache::Filter handler, runs the PHP, it has a disturbing tendency to create little PHP zombies that never go away; eventually the HTTP server accumulates too many of them and has to be restarted.

Here is the relevant code:

use IPC::Open2;

if ($content =~ /\<\?php (.*?)\?\>/s) {
    $scrap = $1 . "\n";

    # Fake the CGI redirect environment that PHP expects when run with -q.
    $ENV{REDIRECT_STATUS}       = "dummy status";
    $ENV{REDIRECT_URL}          = $ENV{SCRIPT_URI};
    $ENV{REDIRECT_URI}          = $ENV{SCRIPT_URI};
    $ENV{REDIRECT_QUERY_STRING} = $ENV{QUERY_STRING};

    open2(\*PHPOUT, \*PHPIN, "php -q 2>&1")
        or log_error("Rotary::Translations: Unable to open php pipe. Reason: $!\n");

    # Feed the PHP snippet to the interpreter and read back its output.
    print PHPIN $scrap;
    close(PHPIN);

    my $line;
    while ($line = <PHPOUT>) {
        next if $line eq "Content-type: text/html\n";
        $newcontent .= $line;
    }
    close(PHPOUT);

    $content = "";
}

I do close the pipe, so what more do I need to do to make it release the zombie?

Re: Perl Web Zombie problem
by sgifford (Prior) on Oct 16, 2003 at 22:00 UTC

    pg is right, and there's another thing to keep in mind: If you write too much to the PHP program, this whole process can hang.

    Consider this situation. You have a big block of text in your PHP tags. You catch this code in your regexp and send it to PHP. PHP starts reading the data you've sent it and starts writing its output to stdout, which you're not reading yet. The OS will buffer this for a while, but when the buffer fills up, PHP's write will block. At that point it will stop reading, and then your program's write will block too. Deadlock! Here's a minimal sketch of exactly this situation (using cat in place of php so it's self-contained):
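    #!/usr/bin/perl
    use strict;
    use warnings;
    use IPC::Open2;

    # Write about a megabyte to the child without ever reading back.
    my $pid = open2(my $from_child, my $to_child, 'cat');

    my $chunk = 'x' x 1024;
    for my $i (1 .. 1024) {
        print $to_child $chunk;               # blocks once both pipe buffers fill
        warn "wrote ${i}K\n" if $i % 64 == 0; # watch how far it gets before hanging
    }
    close($to_child);

    # Never reached: cat's stdout pipe filled, cat's write blocked, cat
    # stopped reading its stdin, and our print above blocked. Deadlock.
    print while <$from_child>;
    close($from_child);
    waitpid($pid, 0);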

    This won't be a problem if you know the amount of data you have is smaller than the size of a pipe buffer (usually at least 4K). If the data may be bigger, you'll have to do something to process the input and output streams in an interleaved way. You can use select for this, or fork off a second process to handle the write and have your current process do the read. Here's a sketch of the fork approach, assuming the $scrap and $newcontent variables from the code in the question:
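    use IPC::Open2;

    my $pid = open2(my $phpout, my $phpin, 'php -q 2>&1');

    # Fork a child whose only job is to feed PHP; the parent drains
    # PHP's output at the same time, so neither side can fill a pipe
    # buffer and deadlock the other.
    my $writer = fork();
    die "fork failed: $!" unless defined $writer;

    if ($writer == 0) {
        close($phpout);      # the child never reads
        print $phpin $scrap;
        close($phpin);
        exit 0;
    }

    close($phpin);           # the parent never writes; PHP sees EOF when the child finishes
    while (my $line = <$phpout>) {
        next if $line eq "Content-type: text/html\n";
        $newcontent .= $line;
    }
    close($phpout);

    # Reap both the writer child and the php process, or they'll
    # linger as zombies.
    waitpid($writer, 0);
    waitpid($pid, 0);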

Re: Perl Web Zombie problem
by pg (Canon) on Oct 16, 2003 at 21:16 UTC

    open2 returns the pid of the child process, and you have to kill it or waitpid on it, depending on your needs.

    Check perlipc and perlfunc (for waitpid). A minimal sketch of that fix applied to the code in the question:
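    use IPC::Open2;

    # Keep the pid that open2 returns...
    my $pid = open2(\*PHPOUT, \*PHPIN, "php -q 2>&1");

    print PHPIN $scrap;
    close(PHPIN);

    while (my $line = <PHPOUT>) {
        next if $line eq "Content-type: text/html\n";
        $newcontent .= $line;
    }
    close(PHPOUT);

    # ...and reap the child after closing the pipes. Without this,
    # the exited php process lingers as a zombie until the Apache
    # child itself exits.
    waitpid($pid, 0);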