I have a script that runs an external command. If the command fails to respond within 20 seconds, I kill it off and proceed to the rest of the script. However, when I have to "kill" the process, it is leaving a zombie child process.
Since this command can be run hundreds to thousands of times in a given period, the zombie processes add up, and eventually the system returns "Resources not available, unable to fork" and the script dies.
Here is the code in question (which I pieced together from sites like this one). Is there a better way to do this, or is my code just screwy?
    eval {
        local $SIG{ALRM} = sub { die "alarm\n" };
        alarm 20;
        $CHILD = fork();
        if( $CHILD == 0 ) {
            exec( "my external command" );
        }
        waitpid( $CHILD, 0 );
        alarm 0;
    };
    if( $@ ) {
        die unless $@ eq "alarm\n";
        kill 9, $CHILD;
    }
What I need is a way to execute an external command via Perl, have it time out after a certain period if it fails to respond, and then kill the child process when the timeout is reached.
This is on OS X 10.6 with the latest version of Perl.
EDIT: Simply adding another waitpid after the kill (as suggested) makes the zombies go away. Not sure why I never tried that before! Thanks for the help!
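For reference, here is a sketch of the fixed flow described in the edit above: after killing the timed-out child, call waitpid once more so the kernel can reap it and no zombie is left behind. The `sleep 60` stand-in for the external command and the shortened 2-second timeout are assumptions for the sake of a runnable example.

```perl
use strict;
use warnings;

my $child;
eval {
    local $SIG{ALRM} = sub { die "alarm\n" };
    alarm 2;    # shortened timeout for the example; the original uses 20
    $child = fork();
    die "fork failed: $!\n" unless defined $child;
    if ( $child == 0 ) {
        # "sleep 60" stands in for the external command (an assumption)
        exec( 'sleep', '60' ) or die "exec failed: $!\n";
    }
    waitpid( $child, 0 );    # normal case: child finished in time
    alarm 0;
};
if ($@) {
    die $@ unless $@ eq "alarm\n";
    kill 'KILL', $child;
    waitpid( $child, 0 );    # the fix: reap the killed child, no zombie
}
print "child reaped\n";
```

The second waitpid matters because kill only delivers the signal; until the parent waits on the dead child, its exit status stays queued in the process table as a zombie.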
In reply to exec creating zombie processes by jaiieq