in reply to launching concurrent processes

What did you try exactly? Anyway, try to look at perldoc -f fork and perldoc -f exec.

Flavio

Don't fool yourself.

Replies are listed 'Best First'.
Re^2: launching concurrent processes
by Anonymous Monk on Mar 30, 2005 at 15:20 UTC
    When using fork and exec, is there a way of not waiting for the child to complete? What I mean is: can I have fork create 3 copies of the process, without waiting for each child to finish before forking the next one, and then round them all up at the end?
      I really find it difficult to understand what you're asking, sorry for this.

      When fork() is called, you end up having two processes: the parent and the child. They both execute the same code (your Perl script), but you can tell you're in the child by testing whether the call returned 0. Then, you're supposed to call exec() (if you really need it), which replaces the child process with the program you want to execute; but this isn't necessary if the code you want to run is in the parent script itself!
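      As a minimal sketch of that fork()/exec() pattern (the echo command here is just a placeholder for whatever program you actually want to run):

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $pid = fork;
die "fork(): $!, stopped" unless defined($pid);

if ($pid == 0) {
    # Child: exec() replaces this process with the external program.
    # exec() only returns on failure, hence the "or die".
    exec('echo', 'hello from the child')
        or die "exec(): $!, stopped";
}

# Parent: fork() returned immediately, so we are free to do other
# work here; waitpid() reaps the child when we are ready.
my $reaped = waitpid($pid, 0);
my $status = $? >> 8;
```

      Note that the parent is never blocked by the child's exec(): it only waits when (and if) it chooses to call waitpid().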

      The two processes created by fork() are independent, so they will both be running after the call, if this is what you're worried about. Unlike system(), which waits for the child process to complete, after fork() you can do whatever you want in the parent process, even calling fork() two more times to complete the creation of your three subprocesses.

      Simple snippet:

      #!/usr/bin/perl
      use strict;
      use warnings;

      sub child {
          print("Hey, I'm child $$\n");
          sleep(2 + rand(5));
          print("$$ exiting...\n");
          exit(0);
      }

      foreach (1 .. 3) {
          my $pid = fork;
          die "fork(): $!, stopped" unless defined($pid);
          child() unless $pid;
      }

      # Only the parent reaches this point
      wait foreach (1 .. 3);
      which yields, for example:

      Hey, I'm child 1048
      Hey, I'm child 1368
      Hey, I'm child 424
      1368 exiting...
      424 exiting...
      1048 exiting...

      Flavio

      Don't fool yourself.
        Thanks for your time and sorry if I am getting you confused. What I basically need is to make 3 system-type calls to run 3 separate C programs at once from a master Perl script, and then amalgamate the output from the 3. The reason for having concurrent jobs is to trick openMosix into load-balancing the jobs on a cluster. I was under the impression that Perl's fork would wait for the child process to finish, as it would with a system call.
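        One way to sketch that "run 3 programs at once and amalgamate their output" idea is with piped opens, which fork a child per command; the echo commands below are stand-ins for your three C programs:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical commands; replace with your three C programs.
my @commands = ('echo one', 'echo two', 'echo three');

my %fh_of;
for my $cmd (@commands) {
    # Each piped open forks a child running $cmd; all three
    # children run concurrently from this point on.
    open(my $fh, '-|', $cmd) or die "open($cmd): $!, stopped";
    $fh_of{$cmd} = $fh;
}

# Amalgamate the output: reading each handle to EOF (and closing
# it) also waits for that child, so by the end all three are done.
my @output;
for my $cmd (@commands) {
    my $fh = $fh_of{$cmd};
    push @output, <$fh>;
    close($fh) or warn "close($cmd): $!";
}
print @output;
```

        Collecting through pipes keeps each program's output separate until you choose how to merge it, which is handy if the three jobs finish in an unpredictable order.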
        Thanks for your help. You have cleared the confusion for me. I have managed to get it working as I want it to. Many thanks.