jaiieq has asked for the wisdom of the Perl Monks concerning the following question:
I have a script that runs an external command. If the command fails to respond within 20 seconds, I kill it off and proceed to the rest of the script. However, when I have to "kill" the process, it is leaving a zombie child process.
Since this command can be run hundreds or thousands of times in a given period, the zombie processes add up until the system starts reporting "Resources not available, unable to fork" and the script eventually dies.
Here is the code in question (which I pieced together from sites like this one). Is there a better way to do this, or is my code just screwy?
    eval {
        local $SIG{ALRM} = sub { die "alarm\n" };
        alarm 20;
        $CHILD = fork();
        if ( $CHILD == 0 ) {
            exec( "my external command" );
        }
        waitpid( $CHILD, 0 );
        alarm 0;
    };
    if ( $@ ) {
        die unless $@ eq "alarm\n";
        kill 9, $CHILD;
    }
What I need is a way to execute an external command via Perl, have it time out after a certain period if it fails to respond, and then kill the child process when it hits the timeout.
This is on OS X 10.6 with the latest version of Perl.
EDIT: Simply adding another waitpid after the kill (as suggested) got rid of the zombies. Not sure why I never tried that before! Thanks for the help!
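For reference, a minimal sketch of the fixed version, assuming the same structure as the code above; the extra waitpid after kill is the only substantive change, and "my external command" is just a placeholder for the real command:

    my $CHILD;
    eval {
        local $SIG{ALRM} = sub { die "alarm\n" };
        alarm 20;
        $CHILD = fork();
        if ( $CHILD == 0 ) {
            exec( "my external command" );   # placeholder command
            exit 1;                          # only reached if exec itself fails
        }
        waitpid( $CHILD, 0 );   # reap the child on the normal path
        alarm 0;
    };
    if ( $@ ) {
        die unless $@ eq "alarm\n";
        kill 9, $CHILD;         # terminate the unresponsive child
        waitpid( $CHILD, 0 );   # reap it so no zombie is left behind
    }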
Replies are listed 'Best First'.
Re: exec creating zombie processes
by ikegami (Patriarch) on Feb 14, 2011 at 19:08 UTC

Re: exec creating zombie processes
by ELISHEVA (Prior) on Feb 14, 2011 at 19:35 UTC
    by jaiieq (Novice) on Feb 14, 2011 at 19:42 UTC

Re: exec creating zombie processes
by Perlbotics (Archbishop) on Feb 14, 2011 at 19:06 UTC

Re: exec creating zombie processes
by JavaFan (Canon) on Feb 14, 2011 at 20:50 UTC
    by ikegami (Patriarch) on Feb 14, 2011 at 22:05 UTC

Re: exec creating zombie processes
by sundialsvc4 (Abbot) on Feb 14, 2011 at 19:08 UTC