silicon39 has asked for the wisdom of the Perl Monks concerning the following question:

If a Perl CGI script is called from an HTML page, but the CGI has a bug which causes an infinite loop, why does the process continue even after Netscape/IE reports that the "page cannot be displayed"? This leaves zombie processes when programmers make mistakes. For instance, say you neglect to put a "last if eof" statement in a "while(1)" read loop (bad, bad person!!): the process continues in an infinite loop, the programmer is none the wiser, and the poor processor is left to contend with multiple looping CGI scripts. On a payroll day this could get you fired :-) Anyone have suggestions? The "while(1)" loop is just an example, so I am not looking for a solution to that, because it is obvious :-) Thanks for your help. Ian

Replies are listed 'Best First'.
Re: Perl infinite loops!
by kilinrax (Deacon) on Dec 06, 2000 at 20:36 UTC
    Somewhere near the top of your cgi script you could put something like the following:
    alarm(180); # stop script after 3 minutes
    That will kill the CGI script's process (with a SIGALRM) three minutes after that line of code executes.

    You could even go as far as printing a warning message if the script is killed this way:
    $SIG{ALRM} = \&sighandler;

    sub sighandler {
        print "WARNING: Script timed out or was killed\n";
        exit(0);
    }
      It would be worth logging that to a file, or at least terminating the html fully. You wouldn't want a "Document contains no data" situation and no message logged.

      In order to do that, you may have to keep track of where you are. There may be other things to do before you can terminate the html or the script.

      This is not as simple as it seems.
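      The points above can be sketched as follows (assuming Perl 5, not the Perl 4 mentioned later in the thread; the eval/alarm idiom shown here is a common variant of the handler approach, and the timeout value and closing HTML are illustrative):

```perl
# Sketch of a CGI timeout that still terminates the HTML cleanly instead
# of leaving a half-sent page. The 1-second limit is for demonstration;
# a real script would use something like 180.
my $timed_out = 0;
eval {
    local $SIG{ALRM} = sub { die "timeout\n" };  # handler just breaks out of eval
    alarm(1);
    1 while 1;            # simulated runaway loop
    alarm(0);             # cancel the alarm on normal completion
};
if ($@ && $@ eq "timeout\n") {
    $timed_out = 1;
    # here you would also log the event to a file, then close the HTML
    print "<p>WARNING: script timed out</p></body></html>\n";
}
```

      Keeping the handler body tiny (just dying out of the eval) and doing the logging and HTML cleanup afterwards makes it easier to "keep track of where you are" before terminating.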

      --
      Brother Marvell

      This worked great, but I can't get the handler to work. It gives me a "Spurious backslash ignored" on the "$SIG{ALRM} = \&sighandler;" line. I'm not an advanced Perl programmer, so I'm not quite sure where to go, but the alarm() call worked well. Thanks. Ian. BTW, we're still on Perl 4.xxxx (Aaarrrrgh! Don't say anything about this!) :-)
        IIRC, the syntax in Perl 4 is:
        $SIG{ALRM} = 'sighandler';
        Though I could easily be wrong about that :-/
Re: Perl infinite loops!
by AgentM (Curate) on Dec 06, 2000 at 21:20 UTC
    The browser is a very poor debugging mechanism for CGI simply because it displays nothing about errors unless you are sure to throw in CGI::Carp (which is recommended). Also, if you screw up your headers, you'll have NO idea of what's going on since the web server or browser will end up with some confusing info. In short,
    use CGI;
    use CGI::Carp qw/fatalsToBrowser set_message/; # read more about these in the docs link above
    alarm($x); # if you wish, though I consider this a quick hack, since a heavy
               # server load may actually force your users to wait this long even
               # though there is no infinite loop, and still get the info they
               # wanted cut off
    The best solution is to test all CGI beforehand with interactive mode or the debugger ("or" is not mutually exclusive here; that's "xor", which I would have written if I had meant that). Interactive mode is enabled when you run your CGI script from the command line. This also allows you to easily pass batch CGI params by storing them in a file instead of typing and retyping them in a browser.
    perl cgi.cgi < batch\ file\ with\ \\n\ delimited\ CGI\ params.file
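    Command-line testing is easier still if the script falls back to a canned query string when no real one is set. A minimal hand-rolled decoder in that spirit (the parameter names and fallback values below are made up for illustration; a real script would normally let CGI.pm do this):

```perl
# Decode CGI parameters from QUERY_STRING, with a canned fallback so the
# script can be exercised from the command line without a web server.
# The fallback string and parameter names are illustrative only.
$ENV{QUERY_STRING} = 'name=Ian&dept=pay%20roll'
    unless defined $ENV{QUERY_STRING};

my %param;
for my $pair (split /&/, $ENV{QUERY_STRING}) {
    my ($k, $v) = split /=/, $pair, 2;
    $v = '' unless defined $v;
    $v =~ tr/+/ /;                            # '+' encodes a space
    $v =~ s/%([0-9A-Fa-f]{2})/chr hex $1/ge;  # decode %XX hex escapes
    $param{$k} = $v;
}
```

    With that in place, `QUERY_STRING='foo=bar' perl cgi.cgi` exercises the script with arbitrary params, no browser involved.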
    (Oh geesh! I wrote coed instead of code before- I wonder what I was thinking about....)
    AgentM Systems nor Nasca Enterprises nor Bone::Easy nor Macperl is responsible for the comments made by AgentM. Remember, you can build any logical system with NOR.
      Wouldn't a high $x for alarm($x) be worthwhile? If you know what your script does, you ideally know how long it "should" take to run; multiply that number by 5 or 10 to account for high server load, and voila.

      I'm curious how long the CGI gets to run in an infinite loop before Apache zombies the process. I know I've accidentally done that, realized it after about 30-45 seconds, then popped onto the server to SIGKILL the offending perl process.

      -marius
        You can use mod_perl or FastCGI to limit CGI runtime, reload programs to guard against memory leaks, limit the number of scripts running at any time, load shared objects, preload CGIs (almost like multithreading), and reduce server load. Nowadays, there is no reason to go without such insanely useful Apache mods. My argument against the alarm is that it is A solution but not the best solution. The best is simply to remove the infinite loop or memory leak. If you just throw in an alarm, your script may be killed, perhaps for no reason at all, while you search desperately for the error. Also, this is not portable, in the sense that the script may take 10 seconds on computer #1 and 400 seconds on computer #2. That is not good: throwing in a dependence on time is not a good way to ensure user-friendliness. Using the FastCGI mechanism rather than alarm is also smarter, since FastCGI is not script-dependent but server-dependent, and the server isn't likely to be moving around much; thus the same functionality via a better channel.
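        The "reload progs against memory leaks" idea can be sketched like this: a persistent worker serves a fixed number of requests and then exits, so the server respawns a fresh process before a slow leak matters. In the sketch below, accept_request() and handle_request() are stubs standing in for FCGI::accept and real request handling (purely illustrative; a real FastCGI script would use the FCGI module):

```perl
# Sketch of a FastCGI-style worker that recycles itself after a fixed
# number of requests. The stubs and MAX_REQUESTS value are illustrative;
# real deployments would tune the limit and use FCGI::accept in the loop.
use constant MAX_REQUESTS => 500;

my $handled = 0;
while (accept_request()) {
    handle_request();
    last if ++$handled >= MAX_REQUESTS;  # exit; the server respawns us fresh
}

sub accept_request { 1 }  # stub: pretend a request is always waiting
sub handle_request { }    # stub: per-request work would go here
```

        The point is that the limit lives in the server/worker machinery, not in per-script alarm() guesses, so it doesn't depend on how fast any one machine runs the script.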
        AgentM Systems nor Nasca Enterprises nor Bone::Easy nor Macperl is responsible for the comments made by AgentM. Remember, you can build any logical system with NOR.