asdfgroup has asked for the wisdom of the Perl Monks concerning the following question:

Hi, I'm trying to spawn a long-running process with the help of the following code:
redirect_to("Prev_page.html") if fork(); close STDIN; close STDERR; close STDOUT; use POSIX 'setsid'; setsid or die 'cant start new session'; # Long-running code here
Unfortunately, it mysteriously stops working after ~10 minutes.
The same code runs from the command line without any problems.
Can anybody point me in the right direction?
PS: my system settings
-bash-2.05b# uname -a
FreeBSD www3 4.8-RC FreeBSD 4.8-RC #0: Tue Mar 11 15:00:17 GMT 2003 netwave@www3:/usr/obj/usr/src/sys/GENERIC i386
-bash-2.05b# perl -v
This is perl, v5.8.0 built for i386-freebsd
...
[Tue Mar 11 16:24:04 2003] [notice] Apache/1.3.26 (Unix) mod_perl/1.26 mod_throttle/3.1.2 PHP/4.2.2 mod_ssl/2.8.10 OpenSSL/0.9.6f configured
Sincerely, Nikita Savin

Replies are listed 'Best First'.
Re: Problem with fork in CGI
by robartes (Priest) on Apr 25, 2003 at 11:54 UTC
    One reason for programs to mysteriously stop working when run in a different user context (the CGI user versus yourself, in this case) is that the OS imposes some kind of limit (most likely CPU time or wallclock time here) on processes owned by that user.

    That said, it would help very much, as PodMaster indicated, to reopen STDERR somewhere useful so you might actually get a helpful message.
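
    A minimal sketch of that, assuming the web server user can write to the (example) log path /tmp/longrun.log; reopening STDIN and STDOUT to /dev/null at the same time is a common extra step when daemonizing:

    # In the child, right after the close() calls:
    open STDIN,  '<',  '/dev/null'        or die "Can't reopen STDIN: $!";
    open STDOUT, '>',  '/dev/null'        or die "Can't reopen STDOUT: $!";
    open STDERR, '>>', '/tmp/longrun.log' or die "Can't reopen STDERR: $!";
    warn "child $$ detached";             # now lands in /tmp/longrun.log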

    CU
    Robartes-

Re: Problem with fork in CGI
by PodMaster (Abbot) on Apr 25, 2003 at 10:32 UTC
    Try this
    close STDERR;
    die " die die die ";
    __END__
    I wonder why that didn't print anything, hmmm ;)


    MJD says you can't just make shit up and expect the computer to know what you mean, retardo!
    I run a Win32 PPM repository for perl 5.6x+5.8x. I take requests.
    ** The Third rule of perl club is a statement of fact: pod is sexy.

      Hmm, yes, copy-paste will ruin the world.
      And of course, you are right - this die will write nothing.
      But please read the question again.
      The question was: why does the program mysteriously stop executing?
      I wasn't asking why die doesn't output anything ;)
Re: Problem with fork in CGI
by Notromda (Pilgrim) on Apr 25, 2003 at 15:30 UTC
    Since it was forked by a webserver process, it inherits whatever limitations the webserver has, which could include ulimits on CPU time.
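
    If BSD::Resource happens to be installed (an assumption; it is a CPAN module, not core), the forked child can report the CPU-time limit it actually inherited from Apache, for example:

    use BSD::Resource;    # CPAN module, assumed installed

    # Soft and hard CPU-time limits, in seconds, for this process
    my ($soft, $hard) = getrlimit(RLIMIT_CPU);
    warn $soft == RLIM_INFINITY
        ? "no CPU-time rlimit inherited\n"
        : "CPU-time rlimit: soft=$soft hard=$hard seconds\n";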

    One way to get around this might be to use the "at" scheduler; look at the man page for "at" on almost any Unix box.
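
    A minimal sketch of that approach from inside the CGI script; the script path is only a placeholder, and it assumes the web server user is allowed to use at(1) (check at.allow/at.deny):

    # Hand the heavy work to at(1) so Apache's limits don't apply to it.
    open my $at, '|-', 'at now'
        or die "Cannot start at: $!";
    print $at "/usr/bin/perl /path/to/long_job.pl\n";   # placeholder command
    close $at or warn "at returned non-zero status: $?";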

Re: Problem with fork in CGI
by marinersk (Priest) on Apr 25, 2003 at 19:37 UTC
    The ten-minute limit is a big red flag -- very fishy. That's precisely how long it took my CGI scripts to time out when I was trying out Perl2Exe (the web host didn't support Perl in its native, source-readable form), and since I was only evaluating it, I was obviously using it in unregistered mode.

    Turns out the unregistered version (hacks listed in the snippet section to get around this notwithstanding) puts up a prompt window at the end of every execution. There was nobody at the web hosting site to respond to it, and to prevent long-running processes, they had the system set up to kill any CGI process that ran longer than 10 minutes.

    Bet you're facing the same kind of thing. You're forking off a long-running process/thread/whatever and something else is saying "Ah, you run too long, I'mma' gonna' whacka' you now." and off you go to PID-Purgatory.

    Just a guess.