gurusrin has asked for the wisdom of the Perl Monks concerning the following question:

On my systems, jobs are created on the fly. The jobs are nothing but shell scripts. These shell scripts reside on an NFS share that is accessible from different operating systems, and any system that has this NFS share mounted can execute them. To ensure that resources are not hogged, we allow only a few processes to run per system (counting both the already-running jobs and the one that is newly launched). A sketch of one way to enforce that cap is shown below.
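
A minimal sketch of enforcing such a per-system cap from the dispatching Perl script, tracking child PIDs and reaping finished ones before launching more (the cap value and job paths here are hypothetical, not from the original post):

    use strict;
    use warnings;
    use POSIX ':sys_wait_h';

    my $MAX_JOBS = 4;    # assumed cap; the real per-system limit is site policy
    my %running;         # pid => job path

    # Reap any children that have already exited, without blocking.
    sub reap_finished {
        while ((my $pid = waitpid(-1, WNOHANG)) > 0) {
            delete $running{$pid};
        }
    }

    # Launch a job only if we are under the cap; returns true if launched.
    sub maybe_launch {
        my ($job) = @_;
        reap_finished();
        return 0 if keys %running >= $MAX_JOBS;
        my $pid = fork;
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) { exec '/bin/sh', $job or die "exec failed: $!" }
        $running{$pid} = $job;
        return 1;
    }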

Since multiple systems have access to the same NFS share, I have implemented locks using hard links.
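
For reference, the usual hard-link trick works because link() is atomic even over NFS: exactly one client's link() call succeeds. A minimal sketch, with hypothetical paths:

    use strict;
    use warnings;
    use Sys::Hostname;

    my $lockfile = '/nfs/jobs/locks/job42.lock';    # hypothetical lock path
    my $tmpfile  = "$lockfile." . hostname() . ".$$";

    open my $fh, '>', $tmpfile or die "cannot create $tmpfile: $!";
    close $fh;

    if (link $tmpfile, $lockfile) {   # atomic: succeeds for exactly one host
        unlink $tmpfile;
        # ... lock held: run the job ...
        unlink $lockfile;             # release the lock
    }
    else {
        unlink $tmpfile;              # another system holds the lock
    }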

Right now, the script just picks up the next available script and runs it in the background using Perl's qx{}. However, we are seeing that sometimes these shell scripts fail to execute.
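
Worth noting: qx{} blocks until the command exits, so "running in the background" with qx{} usually means the command string itself ends in &, in which case $? reflects only whether the shell managed to launch the script, not whether the script succeeded. A fork/exec sketch that backgrounds the job but keeps its real exit status (the job path is made up for illustration):

    use strict;
    use warnings;
    use POSIX ':sys_wait_h';

    my $job = '/nfs/jobs/cleanup.sh';   # hypothetical job path

    my $pid = fork;
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        exec '/bin/sh', $job or die "exec failed: $!";   # child runs the job
    }

    # Parent polls without blocking, so it can keep dispatching other work.
    while (waitpid($pid, WNOHANG) == 0) {
        sleep 1;
    }
    my $status = $? >> 8;               # the job's real exit status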

I need your advice on how I can check whether the process failed, and if it failed, how I can ensure that it is restarted.

Replies are listed 'Best First'.
Re: Running background processes
by FreeBeerReekingMonk (Deacon) on Apr 01, 2016 at 19:49 UTC
    See the Quote-Like Operators section of perlop for how qx() behaves.

    If you need to run with retries, why not look at AnyEvent::Timer::Retry or Action::Retry?
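
    If pulling in a CPAN dependency is not an option, the same idea can be hand-rolled in a few lines. A sketch, assuming the scripts exit non-zero on failure (the job path is made up):

        use strict;
        use warnings;

        sub run_with_retries {
            my ($cmd, $max_tries) = @_;
            for my $try (1 .. $max_tries) {
                system($cmd);
                return 1 if $? == 0;                # success
                warn "try $try/$max_tries failed (status ", $? >> 8, ")\n";
                sleep 2 ** $try;                    # crude exponential backoff
            }
            return 0;                               # gave up
        }

        run_with_retries('/nfs/jobs/cleanup.sh', 3)
            or die "job never succeeded\n";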

    If you are forking off a child that uses qx(), then why not read the basic $? (provided your scripts actually do exit non-zero on errors)?

    fbrm@monastery:~/$ mkdir /tmp/x
    fbrm@monastery:~/Downloads$ perl -e 'print qx(ls /tmp/x); print $?'
    0
    fbrm@monastery:~/$ perl -e 'print qx(ls /tmp/y); print $?'
    ls: cannot access /tmp/y: No such file or directory
    512
    fbrm@monastery:~/$
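
    Note that $? holds the raw wait status, not the exit code itself: the 512 above is ls's exit code 2 shifted left eight bits. A minimal decoding sketch:

        my $exit_code = $? >> 8;     # 512 >> 8 == 2, ls's "no such file" status
        my $signal    = $? & 127;    # non-zero if the child died from a signal
        my $dumped    = $? & 128;    # set if the child dumped core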
Re: Running background processes
by GotToBTru (Prior) on Apr 01, 2016 at 13:07 UTC

    Do these scripts provide any feedback as they run or when they complete? You can capture their output in various ways.
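
    For example, one simple way is a piped open, which also sets $? when the handle is closed (the job path here is hypothetical):

        use strict;
        use warnings;

        # Read the job's output line by line through a pipe.
        open my $out, '-|', '/bin/sh', '/nfs/jobs/cleanup.sh'
            or die "cannot start job: $!";
        while (my $line = <$out>) {
            print "job says: $line";    # or write it to a log
        }
        close $out;                     # sets $? just like qx() does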

    But God demonstrates His own love toward us, in that while we were yet sinners, Christ died for us. Romans 5:8 (NASB)