cez has asked for the wisdom of the Perl Monks concerning the following question:

I'm trying to find a reliable way to fork a process off of a web connection that stays up after the user's connection has ended/died. i.e., user goes to the CGI, the CGI forks perl script #2, the CGI and connection die/end, perl script #2 is still running. exec() does this, yes, but I don't think it's too reliable in the context of the web... any ideas? So far it seems like I'll have to run a daemon..

Replies are listed 'Best First'.
Re: forking from web
by Aighearach (Initiate) on Apr 23, 2000 at 06:28 UTC

    You are correct that you need to run a daemon, but I don't think there should be any problem with daemonizing in the CGI before you exec(). For example,

    #!/usr/bin/perl -w
    use CGI qw( header );
    use strict;
    use POSIX qw( setsid );

    my $pid;

    # fork
    die "cannot fork: $!" unless defined( $pid = fork() );

    if ( $pid ) {    # we're the parent
        # reply to user
        print header(), "Process forked.";
    }
    else {           # we're the child
        # chdir so we don't prevent filesystems from being unmounted
        chdir '/' or die "Can't chdir to /: $!";

        # redirect IO, or we'll be sucky and make the CGI hang
        open STDIN,  '<', '/dev/null' or die "Can't read /dev/null: $!";
        open STDOUT, '>', '/dev/null' or die "Can't write to /dev/null: $!";

        # thank you, POSIX ( I'd have hated to call ioctl() myself! ;-)
        setsid or die "Can't start a new session: $!";

        # let's wait until after setsid to redirect STDERR
        open STDERR, '>&', \*STDOUT or die "Can't dup stdout: $!";

        # run our background process
        exec("/path/process");
    }
    
Re: forking from web
by chromatic (Archbishop) on Apr 23, 2000 at 08:06 UTC
    I would actually create a listener program, separate from the CGI. It would open a named pipe (FIFO) for reading (assuming your OS supports them).

    The CGI would just pass along requests. (Take a look at the perlman:perlipc documentation -- you can write to and read from a pipe just as if it were a regular file.)

    It's easy enough to do, but I can probably dig up some code if you like.
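    Until then, here is one possible sketch of that idea -- untested against any particular setup, and the path (/tmp/webjobs.fifo in the usage note below) and the one-line-per-request protocol are made up for illustration:

```perl
#!/usr/bin/perl
# A sketch of the listener idea -- path and protocol are invented for the example.
use strict;
use warnings;
use POSIX qw( mkfifo );

# create the rendezvous point once, if it isn't already there
sub make_fifo {
    my ($path) = @_;
    return if -p $path;
    mkfifo( $path, 0600 ) or die "mkfifo $path: $!";
}

# daemon side: blocks until a writer shows up, returns one request line
sub read_one_request {
    my ($path) = @_;
    open my $fh, '<', $path or die "open $path: $!";
    my $request = <$fh>;
    close $fh;
    chomp $request if defined $request;
    return $request;
}

# CGI side: write the request down the pipe and get back to the user
sub send_request {
    my ( $path, $msg ) = @_;
    open my $fh, '>', $path or die "open $path: $!";
    print $fh "$msg\n";
    close $fh;
}
```

    The daemon would call read_one_request('/tmp/webjobs.fifo') in a loop and fork() off the real work for each request; the CGI just calls send_request() and prints its page, never waiting on the job.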

RE: forking from web
by Anonymous Monk on Apr 23, 2000 at 01:59 UTC

    Don't take it from me, but this might not be possible without writing an Apache module or something. You might be well served by using IPC of some sort. If it's simple, you can just use signals to tell a daemon running in the background to fork(). Read the perlipc manpage's section on signals first, but it's pretty simple:

    
    # in the daemon
    $SIG{'INT'} = \&signal_handler;
    sub signal_handler {
        # we don't want to do anything too complex in the handler
        # or we risk an ugly death
        $fork_now = 1;
    }

    # then just test $fork_now in the main loop of your daemon
    # to fork when it's set (then unset it immediately)

    # in the calling process:
    kill -2, $pid; # sends SIGINT to $pid.
    

    If you need to get more complicated information to the daemon than a simple signal can carry, you should explore other forms of IPC, like sockets or SysV message queues and whatnot. I've used signals with a web board to tell a daemon that maintains semi-static content to update itself or re-read its config information, and it works quite well.
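    For what it's worth, the flag-set-in-the-handler pattern is easy to try out in a single process (a sketch: here the script signals itself to stand in for the calling process, and just prints instead of forking):

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $fork_now = 0;
$SIG{INT} = sub { $fork_now = 1 };    # keep the handler trivial

# stand-in for the calling process doing: kill INT => $pid
kill INT => $$;

# one pass of the daemon's main loop
if ($fork_now) {
    $fork_now = 0;                    # unset it immediately
    # ... this is where the daemon would fork() and do the work ...
    print "forking now\n";
}
```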

    Hope that helps.

      sorry, make that kill INT => $pid; or  kill(2, $pid); etc. My bad.