Capt_Howdey has asked for the wisdom of the Perl Monks concerning the following question:

I am using the run_parallel sub I got from here (it works great). I do have one problem, though. Some of the commands I am submitting can take up to 5 minutes to complete. The problem is that the Tk GUI locks up until the commands finish (5 minutes). I need a way to submit all of these without locking up the GUI, but still gather STDERR when they finish. If I run them in the background (&), the GUI does not hang, but I do not get STDERR back (it gets reported in the shell the program was launched from).
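
Roughly the shape of the problem (this is a made-up sketch, not my real code -- the backtick call is a stand-in for where I actually call run_parallel, and the command name is a placeholder):

#!/usr/bin/perl -w
use strict;
use Tk;

my $mw = MainWindow->new;
$mw->Button(
    -text    => 'Run jobs',
    -command => sub {
        # everything in this callback runs inside the Tk event loop,
        # so the GUI is frozen until the command comes back
        my $output = `some_long_command`;
        print "command finished: $output\n";
    },
)->pack;
MainLoop;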

Replies are listed 'Best First'.
Re: more questions about &run_parallel
by RMGir (Prior) on Mar 18, 2002 at 16:24 UTC
    Hmmm, I'd need to see the original run_parallel, I think.

    Did you try using "2>&1"?

    #!/usr/bin/perl -w
    use strict;
    my $x = `ls a.a 2>&1`;
    print "dollar x is $x\n";
    print "***************\n";

    That captures the stderr output nicely.

    But I'm not sure this trick will work with whatever underlies your run_parallel subroutine.

    Could you post a pointer to the node where you found it?
    --
    Mike

      http://www.perlmonks.com/index.pl?lastnode_id=147112&node_id=28870
Re: more questions about &run_parallel
by RMGir (Prior) on Mar 19, 2002 at 12:22 UTC
    Thanks for the pointer to run_parallel.

    The line in run_parallel that says:

    my $proc_id = open3("<&NULL", ">&STDOUT", ">&STDERR", @$job);
    is what actually runs your program. As you can see, STDOUT and STDERR for the program are redirected to YOUR STDOUT and STDERR.

    You have 2 choices (that I can see).

    First, change run_parallel to redirect the output to different file handles.

    Second, close STDOUT and STDERR in your main program before calling run_parallel, and reopen them to point to log files.

    close(STDOUT);
    open STDOUT, ">log.out" or die "Can't open log.out, error $!";
    close(STDERR);
    open STDERR, ">log.err" or die;  # can't really print a message, STDERR is closed

    Hope this helps!

    Oh, by the way, since all of these jobs are writing to the same file, there's always the chance you could get interleaved writes that garble your output. You might want to consider modifying run_parallel so it opens up DIFFERENT log and error files for each job, along the lines of the sketch below...
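
    Just as a rough sketch (this isn't the real run_parallel code, and $i here is just whatever job index you have handy in its loop), the open3 call could become something like:

    open( OUTLOG, ">job$i.out" ) or die "Can't open job$i.out: $!";
    open( ERRLOG, ">job$i.err" ) or die "Can't open job$i.err: $!";
    my $proc_id = open3( "<&NULL", ">&OUTLOG", ">&ERRLOG", @$job );

    That way each job dups its own pair of file handles, and no two jobs can step on each other's output.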
    --
    Mike