blax has asked for the wisdom of the Perl Monks concerning the following question:

Hello,

First of all, I think this code will probably be system dependent, so you should know I am trying to get this to work on GNU/Linux. OK, here is my problem: I want to write a script that will keep two processes going at all times. To be more verbose: it will start two processes, wait for either of them to exit, and then start another process when that happens. I hope I have been clear enough.

I am aware of the fork and waitpid functions, but I cannot think of any way to get this working. Any help will be much appreciated.

Thank you,
blax

Replies are listed 'Best First'.
Re: Keep Two Processes Going
by Zaxo (Archbishop) on May 29, 2004 at 04:53 UTC

    Depending on the details of what you want from the kids, there are several ways.

    1. Parallel::ForkManager lets you set a maximum number of processes which all have the same code. When one exits, another will appear.
    2. A $SIG{CHLD} handler can arrange to replace a finished process with another.
    3. A wait loop can fork a replacement.
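
    A minimal sketch of option 3, the wait loop. The child's command here (`sleep 1`) and the `$LIMIT` cutoff are stand-ins so the demo terminates; a real script would exec the actual job and could drop the limit to run forever.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Sketch of option 3: a wait loop forks a replacement for each child
# that exits. 'sleep 1' is a placeholder for the real work.
my $WANTED  = 2;   # how many children to keep alive at once
my $LIMIT   = 6;   # total children for this demo; remove to run forever
my $spawned = 0;
my %kids;

sub spawn {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        exec 'sleep', '1';         # child: do the work, then exit
        die "exec failed: $!";
    }
    $kids{$pid} = 1;               # parent: remember the child
    $spawned++;
}

spawn() for 1 .. $WANTED;
while (%kids) {
    my $pid = wait();              # block until any child exits
    next unless delete $kids{$pid};
    spawn() if $spawned < $LIMIT;  # replace the finished child
}
print "spawned $spawned children\n";
```

    Because the parent blocks in wait() rather than in a signal handler, there is no window where a child can exit before its pid has been recorded.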

    What do your processes do? It's not clear from your description that either process needs to exit until the work is done.

    After Compline,
    Zaxo

      Hello,

First off, thank you for your reply. I believe signals will work great.

      <snip> It's not clear from your description that either process needs to exit until the work is done. </snip> I am not trying to be an asshole, but I didn't think I needed to be clear as to why I needed two processes running at the same time. I thought I could just say it. But since it may help, here is my response.

The process is nget. Nget is a command-line Usenet binary grabber. The reason I want two ngets running at the same time is so that I have two connections to my Usenet provider. Without at least two ngets running I don't get as good a download rate (retrieving articles and then assembling them takes time away from downloading).
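
      For this use case, Zaxo's first suggestion fits directly: Parallel::ForkManager caps the number of simultaneous children and starts the next job as soon as one finishes. A hedged sketch, assuming the CPAN module is installed; `sleep` stands in for the actual nget invocation:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Parallel::ForkManager;   # CPAN module mentioned above

my $MAX  = 2;                # keep two downloads going at once
my @jobs = 1 .. 6;           # stand-ins for the queue of things to fetch
my $done = 0;

my $pm = Parallel::ForkManager->new($MAX);
$pm->run_on_finish( sub { $done++ } );  # runs in the parent as kids are reaped

for my $job (@jobs) {
    $pm->start and next;     # forks; parent blocks here once $MAX kids exist
    # --- child process ---
    # a real script would run the nget command here, e.g. via system()
    sleep 1;                 # placeholder for the download work
    $pm->finish;             # child exits; the next job starts in its place
}
$pm->wait_all_children;
print "finished $done jobs\n";
```

      The module handles the fork/wait bookkeeping, so the script only describes the work and the degree of parallelism.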

Re: Keep Two Processes Going
by BrowserUk (Patriarch) on May 29, 2004 at 06:59 UTC

    You could do it the easy way.

    #! perl -slw
    use strict;
    use threads qw[ async yield ];
    use Thread::Queue;

    my $Qrv  = new Thread::Queue;
    my @work = qw[ C: D: P: Q: T: U: V: W: Z: ];

    for ( 1 .. 2 ) {
        my $cmd = join ' ', 'dir /s', shift( @work ), '>nul';
        async( sub{ $Qrv->enqueue( "$_: " . system( $cmd ) ); } )->detach;
    }

    while( @work ) {
        my( $threadNo, $rv ) = split ': ', $Qrv->dequeue();
        my $cmd = join ' ', 'dir /s', shift( @work ), '>nul';
        async( sub{ $Qrv->enqueue( "$threadNo: " . system( $cmd ) ); } )->detach;
    }
    sleep 2;

    A slightly more verbose version shows the workings


    Examine what is said, not who speaks.
    "Efficiency is intelligent laziness." -David Dunham
    "Think for yourself!" - Abigail
Re: Keep Two Processes Going
by thospel (Hermit) on May 29, 2004 at 11:41 UTC
    Personally, I try to avoid using signals in cases like this because they are dangerous in old perls and hard to get race-free in new perls (you need to handle the case where a process exits before you have properly stored its pid).

    Instead, I usually try to have a pipe into each of the processes and poll/select on them. Most programs don't close the standard handles, so you can actually use those; in this case I use STDIN.

    The following code demonstrates the idea. Modify it to fit whatever your constraints are.

    #! /usr/bin/perl -w
    use 5.008;
    use strict;
    use IO::Select;

    # program and args
    my @work = qw(sleep 3);
    my $wanted = 2;

    my $s = IO::Select->new;
    my $have = 0;
    while ($wanted) {
        while ($have < $wanted) {
            open(my $fh, "|-", @work) || die "Could not fork: $!";
            $s->add($fh);
            $have++;
        }
        for ($s->can_read()) {
            $s->remove($_);
            $have--;
            close $_;
            die "Unexpected returncode $?" if $?;
            # Real code may change $wanted here
        }
    }