Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I'd like to write a script that will run another script and report its exit code/status. Is this possible without forking or using system(), which also creates a separate process for the final script?

Essentially, what I'm after is the result of scripts that will be scheduled through cron. If there's a way to manage this with the script wrapper I've mentioned, then I could just schedule the wrapper (with the intended script as an argument to it) in the crontab.

Is there a way to go about this, or is there a much easier way to accomplish what I'm trying to do?

Thanks

Re: Script wrapper script
by sgifford (Prior) on Aug 17, 2005 at 20:06 UTC

    It's impossible to do exactly what you describe. The exit status of a program isn't returned until it exits, at which point there would have to be another program waiting around to report what happened.

    I'd be surprised if creating 2 processes for each task is really a problem. The "wrapper" program would just start another script then go to sleep, at which point it would use an extremely small amount of system resources. I frequently start up chains of programs from cron, and have never had problems with this.
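
    For what it's worth, a minimal wrapper along those lines might look like the sketch below (the script name and the use of warn() for reporting are just illustrations, not something from the original thread):

    #!/usr/bin/perl
    # run_and_report.pl -- hypothetical cron wrapper: run a command, report its status
    use strict;
    use warnings;

    die "usage: $0 command [args...]\n" unless @ARGV;
    my @cmd = @ARGV;

    system(@cmd);    # forks, runs the job, and waits for it to finish

    my $status = $?;
    if ($status == -1) {
        warn "failed to run '@cmd': $!\n";
    } elsif ($status & 127) {
        warn "'@cmd' was killed by signal ", ($status & 127), "\n";
    } else {
        warn "'@cmd' exited with status ", ($status >> 8), "\n";
    }
    exit( $status == -1 ? 1 : ($status >> 8) );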

    But, you can get close to what you want by using do, as chibiryuu suggests, and overriding exit:

    my $prog = shift;

    # Install the override through CORE::GLOBAL so that exit() calls in the
    # script compiled later by do() are intercepted; a plain "sub exit" in
    # the same package would not replace the builtin.
    *CORE::GLOBAL::exit = sub { die "exit " . (defined $_[0] ? $_[0] : 0) . "\n" };

    eval {
        do $prog or die "Couldn't run '$prog': $!\n";
    };
    print "'$prog' results: $@\n";

    my $childexit;
    if (!$@) {
        $childexit = 0;
    } elsif ($@ =~ /^exit (\d+)$/) {
        $childexit = $1;
    } else {
        $childexit = 1;
    }
    CORE::exit($childexit);
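
    As a quick sanity check (the file names here are made up), wrapping a trivial script such as

    # child.pl -- hypothetical test script
    print "doing some work\n";
    exit 3;

    with "wrapper.pl ./child.pl" should print "'./child.pl' results: exit 3" and make the wrapper itself exit with status 3. The "./" matters, since do() searches @INC for relative paths.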
Re: Script wrapper script
by chibiryuu (Beadle) on Aug 17, 2005 at 18:50 UTC

    See perlfunc#do-EXPR; that sounds like what you want.

    If these are scripts you don't have control over, though, I'd still fork, just to separate them and make sure they can't clobber anything in your script.
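
    Something like this would be the forking version (a rough sketch with minimal error handling):

    use strict;
    use warnings;

    my $prog = shift or die "usage: $0 script\n";

    my $pid = fork();
    die "fork failed: $!\n" unless defined $pid;

    if ($pid == 0) {
        exec $prog or die "exec '$prog' failed: $!\n";   # child becomes the script
    }

    waitpid($pid, 0);                                    # parent waits, then reports
    printf "'%s' exited with status %d\n", $prog, $? >> 8;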

      I'll have to try using do, and see if I can retain the exit status of my intended script.

      I'm trying to avoid forking because there will be a LOT of scripts being launched from cron, and I'd rather not have two processes per script, as I'm sure the SAs would agree.

      In any event, I really don't want to try to replace cron with a custom scheduler system, but we definitely want to track what ran, how it completed, and log it somewhere. This seems like the best approach. Any other ideas?

      Thanks

Re: Script wrapper script
by graff (Chancellor) on Aug 18, 2005 at 04:30 UTC
    If I understand what you're trying to do, you can just use the shell syntax in the cron command line:
    # crontab file:
    1 1 1 * * /home/me/bin/monthly_job || echo monthly job failed
    2 2 * * 2 /home/me/bin/weekly_job  && echo weekly job succeeded
    and so on. Given that the scripts will exit with zero status on success and non-zero status on failure, the conditionals on the command line will work as you would expect: in the "monthly" example, if the first command fails (non-zero exit status), the second command (following "||") is run; in the "weekly" case, the second command (following the ampersands) will only run if the first command succeeds (zero exit status).
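    Taking that one step further, the exit status itself can be appended to a log file straight from the crontab entry (the log path below is only an example):

    0 3 * * * /home/me/bin/nightly_job; echo "`date` nightly_job exited $?" >> /home/me/log/cron_status.log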
Re: Script wrapper script
by zentara (Cardinal) on Aug 18, 2005 at 13:16 UTC
    See if this works for you. I'm not sure it will always capture $? correctly.
    #!/usr/bin/perl
    use warnings;
    use strict;
    use IPC::Open3;

    #my $cmd = 'ls -la';
    my $cmd = 'date';

    my $pid = open3( \*WRITER, \*READER, \*ERROR, $cmd );
    # if \*ERROR is set to 0, stderr will be combined with stdout
    # set \*WRITER to 0 unless you need to write to stdin

    while ( my $output = <READER> ) {    # can send this to files
        print "output->$output";
    }
    while ( my $errout = <ERROR> ) {     # can send this to files
        print "err->$errout";
    }

    waitpid( $pid, 0 ) or die "$!\n";
    my $retval = $?;
    print "pid->$pid\nretval->$retval\n";
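
    If the exit code itself is what's wanted, $? can be unpacked in the usual way (a small addition to the above, not part of the original snippet):

    my $exit_code = $retval >> 8;     # value the child passed to exit()
    my $signal    = $retval & 127;    # signal that killed the child, if any
    print "exit_code->$exit_code signal->$signal\n";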

    I'm not really a human, but I play one on earth. flash japh