Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hi all,

I have a web script that I would like to prevent from running forever (due to unknown bugs, for example). Is there a way to make it run for, say, 5 minutes, and then kill the process whether it has hung or entered an infinite loop?

I tried the code below but it didn't work:

use Proc::Watchdog;

my $w = Proc::Watchdog->new( { -path => '/tmp' } );

# stop process in 5 mins regardless of outcome
$w->alarm(5);

sub do_while {
    print "hi\n" while 1;
}

do_while();
$w->reset;

The watchdog file is created, but the do_while sub continues to run after 5 secs. Am I missing something here?

Thank you for reading and I look forward to your kind replies.

Re: Kill process after X mins
by Corion (Patriarch) on Jul 21, 2011 at 13:05 UTC

    In Proc::Watchdog, I read:

    A separate daemon (watchd) included along with this module, is called from cron or another similar service to check on the path.

    So, does watchd run on your system and does it work? Also, does it have the appropriate permissions to kill the process(es) you run?

    The three simpler approaches would be:

    1. Configure your webserver to kill processes after a certain time
    2. Use alarm to make your program commit suicide (see the sketch after this list)
    3. Look at what ulimits your OS supports and maybe limit CPU or wallclock time for your processes
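
    For (2), a minimal sketch of the classic alarm idiom (essentially the pattern from perldoc -f alarm; do_work stands in for whatever your script really does):

        use strict;
        use warnings;

        sub do_work {
            print "hi\n" while 1;    # stands in for the code that might hang
        }

        eval {
            local $SIG{ALRM} = sub { die "timeout\n" };   # the trailing "\n" matters below
            alarm 300;    # 5 minutes
            do_work();
            alarm 0;      # cancel the alarm if the work finished in time
        };
        if ($@) {
            die $@ unless $@ eq "timeout\n";   # rethrow unrelated errors
            warn "script exceeded 5 minutes, aborting\n";
        }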

      Can you please let me know how we can start the watchd daemon on a Linux machine? -Pradeep

        Have you read the documentation of Proc::Watchdog?

        If you are asking about the included watchd daemon only, use whatever system service management tool your Linux vendor supports. This is mostly a system administration question, and I know little of Linux system administration.

      Many thanks, Corion!

      alarm does exactly what I was looking for. Does it also work if the process hangs? In my case, it's a "while" loop and it was killed at the specified time.

      I'm using a shared server and don't have control over (1) and (3).

        alarm is likely a feature supplied by your operating system, so whether alarm works for a "hung" process really depends on your operating system and what you consider "hung".

        On unixish operating systems, SIGALRM will interrupt almost any (system) call. Most recent versions of Perl will likely only process signals after system calls have returned ("safe signals"). If your program gets "hung" in an endless loop, the SIGALRM handler will stop your program. If your program gets "hung" in some system call, it will not necessarily get immediately stopped.
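
        If you need SIGALRM to interrupt a blocking call even under safe signals, perlipc ("Interrupting IO") shows how to install the handler with POSIX::sigaction so that it is not deferred; a minimal sketch:

            use strict;
            use warnings;
            use POSIX qw(SIGALRM);

            # Installed via sigaction() rather than %SIG, so Perl's
            # deferred ("safe") signal handling does not apply.
            POSIX::sigaction(SIGALRM, POSIX::SigAction->new(sub { die "timeout\n" }))
                or die "Error setting SIGALRM handler: $!\n";

            eval {
                alarm 10;
                my $line = <STDIN>;   # a read that may block indefinitely
                alarm 0;
            };
            if ($@) {
                die $@ unless $@ eq "timeout\n";   # rethrow unrelated errors
                warn "read timed out after 10 seconds\n";
            }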

        Personally, I question the wisdom of whoever administers the "shared server" without hard ulimits and hard maximum request limits, but maybe they have reasons for not limiting the resources users can consume.
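
        For completeness, the ulimit idea can also be expressed from inside Perl, assuming the BSD::Resource CPAN module is available; note that RLIMIT_CPU counts CPU time, not wall-clock time, so a process blocked in a system call would not be stopped by it:

            use strict;
            use warnings;
            use BSD::Resource qw(setrlimit RLIMIT_CPU);

            # Soft limit of 300 CPU-seconds (the kernel sends SIGXCPU),
            # hard limit of 310 (the kernel sends SIGKILL).
            setrlimit(RLIMIT_CPU, 300, 310)
                or die "setrlimit failed: $!";

            1 while 1;    # a runaway loop: killed once the CPU limit is hit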

Re: Kill process after X mins
by sundialsvc4 (Abbot) on Jul 21, 2011 at 14:54 UTC

    If your web users are capable of initiating a procedure that might run for five minutes, or even much less than that, then I suggest that you should be using a proper batch-job processing system to do the work ... and that your web interface should be just that, an interface to that system. This will give you, in one swell foop, both the ability to control long-running or resource-intensive work that has “run away,” and the ability to regulate the workload that is attempted by the system independently of the number of requests.
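
    As a rough sketch of that split (all file names and paths here are made up for illustration): the web script only queues a job file into a spool directory, and a worker run from cron does the real work under its own time limit:

        # --- enqueue.pl: the web-facing script just queues the request ---
        use strict;
        use warnings;
        use File::Temp qw(tempfile);

        my $spool = '/var/spool/myapp';    # hypothetical spool directory
        my ($fh, $job) = tempfile('job-XXXXXX', DIR => $spool);
        print {$fh} "task=report\n";       # whatever the worker needs to know
        close $fh or die "close: $!";
        print "Content-Type: text/plain\n\nQueued as $job\n";

        # --- worker.pl: run from cron; handles one job with a time limit ---
        use strict;
        use warnings;

        my $spool = '/var/spool/myapp';
        my ($job) = glob "$spool/job-*";
        exit 0 unless defined $job;        # nothing queued

        eval {
            local $SIG{ALRM} = sub { die "timeout\n" };
            alarm 300;                     # hard 5-minute budget per job
            process_job($job);
            alarm 0;
        };
        warn "job $job exceeded 5 minutes\n" if $@ && $@ eq "timeout\n";
        unlink $job or warn "unlink $job: $!";

        sub process_job { my ($file) = @_; print "processing $file\n" }  # stand-in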

      Thank you :)

      It's a controlled script used by a small group of members. But yes, I would eventually need to migrate it to a batch-job processing system.