in reply to capture stdout and stderr from external command

Have you seen this before?
#!/usr/bin/perl
use warnings;
use strict;
use IPC::Open3;
use IO::Select;

my $pid = open3( \*WRITE, \*READ, \*ERROR, "/bin/bash" );

my $sel = IO::Select->new();
$sel->add( \*READ );
$sel->add( \*ERROR );

my ( $error, $answer ) = ( '', '' );

while (1) {
    print "Enter command\n";
    chomp( my $query = <STDIN> );

    # send query to bash
    print WRITE "$query\n";

    foreach my $h ( $sel->can_read ) {
        my $buf = '';
        if ( $h eq \*ERROR ) {
            sysread( ERROR, $buf, 4096 );
            if ($buf) { print "ERROR-> $buf\n" }
        }
        else {
            sysread( READ, $buf, 4096 );
            if ($buf) { print "$query = $buf\n" }
        }
    }
}

waitpid( $pid, 1 );    # It is important to waitpid on your child process,
                       # otherwise zombies could be created.

I'm not really a human, but I play one on earth.
Old Perl Programmer Haiku ................... flash japh

Re^2: capture stdout and stderr from external command
by pankajadvani (Initiate) on Nov 11, 2011 at 19:31 UTC

    I am running these external commands on Windows, and out of the 5000 external commands I have, I want 60 of them running at any given point in time. When one of the external commands finishes, I need to start another, so that 60 external commands are always executing. That's why I have been using Perl's Parallel::ForkManager, which forks the processes and caps them at the maximum number of processes (60 in my case). Now all I need is to get the stdout and stderr of these external commands, which run from the forked child processes.

    my $max_procs = 60;
    my $pm = new Parallel::ForkManager($max_procs);

    foreach my $child ( 0 .. $#cmds ) {
        my $pid = $pm->start($cmds[$child]) and next;

        # This is where I need to get the stdout and stderr.
        # @cmds can hold external commands such as windowsexecutable.exe $args
        # (for example perl.exe script.pl $arg1 $arg2)
        system($cmds[$child]);
        my $Result = $? >> 8;

        $pm->finish($Result);    # pass an exit code to finish
    }
    $pm->wait_all_children;

    The above sample code sets $max_procs to 60 and forks 60 child processes, each executing one system command. All I need is to get the stdout and stderr of the executable run via the system command into some Perl variable.
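    One way to sketch that step (untested, and assuming the commands tolerate having stderr merged into stdout with 2>&1, which cmd.exe understands just as bash does; the @cmds contents below are placeholders):

    use strict;
    use warnings;
    use Parallel::ForkManager;

    my @cmds = ('perl.exe script.pl arg1 arg2');    # placeholder command list

    my $max_procs = 60;
    my $pm        = Parallel::ForkManager->new($max_procs);

    foreach my $child ( 0 .. $#cmds ) {
        $pm->start($cmds[$child]) and next;

        # Merge stderr into stdout and read both back through backticks.
        my $output = `$cmds[$child] 2>&1`;
        my $result = $? >> 8;

        # $output now holds the combined stdout/stderr of the command,
        # but only inside this child process.

        $pm->finish($result);
    }
    $pm->wait_all_children;

    The catch, as the replies below point out, is that $output only exists in the child; getting it back into the parent needs an extra step.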

      Have a look at POE::Wheel::Run; it allows you to run a lot of children and capture their output. You have to learn a bit about POE first, though.
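      A rough, untested sketch of what that looks like for a single command (the command string and event names are placeholders; running 60 at once would mean creating one wheel per command and throttling them yourself):

      use strict;
      use warnings;
      use POE qw(Wheel::Run);

      POE::Session->create(
          inline_states => {
              _start => sub {
                  my $wheel = POE::Wheel::Run->new(
                      Program     => 'perl.exe script.pl arg1 arg2',    # placeholder
                      StdoutEvent => 'got_stdout',
                      StderrEvent => 'got_stderr',
                      CloseEvent  => 'got_close',
                  );
                  $_[HEAP]{wheel} = $wheel;
                  $_[KERNEL]->sig_child( $wheel->PID, 'got_sigchld' );
              },
              got_stdout  => sub { print "STDOUT: $_[ARG0]\n" },
              got_stderr  => sub { print "STDERR: $_[ARG0]\n" },
              got_close   => sub { delete $_[HEAP]{wheel} },
              got_sigchld => sub { },    # reap the child so no zombie is left
          },
      );
      POE::Kernel->run;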

      All I need is to get the stdout and stderr of the executable executed using the system command into some perl variable

      Your problem is that forking runs each $cmd in a different process, so if you put its stdout and stderr into a Perl variable there, the parent will never see it. You will need to open some pipes from the children back to the parent to write your results back, or use threads or some other form of IPC, like shared memory segments. See forking with Storable and IPC::ShareLite.
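      As an alternative to hand-rolled pipes: newer versions of Parallel::ForkManager can ship a data structure from each child back to the parent themselves (serialised with Storable behind the scenes), via a second argument to finish() and the run_on_finish callback. A minimal, untested sketch, with @cmds as a placeholder list:

      use strict;
      use warnings;
      use Parallel::ForkManager;

      my @cmds = ('perl.exe script.pl arg1 arg2');    # placeholder commands

      my $pm = Parallel::ForkManager->new(60);

      my %output_for;    # filled in the parent, keyed by the ident passed to start()

      $pm->run_on_finish( sub {
          my ($pid, $exit_code, $ident, $signal, $core, $data_ref) = @_;
          $output_for{$ident} = $$data_ref if defined $data_ref;
      } );

      foreach my $child ( 0 .. $#cmds ) {
          $pm->start($child) and next;

          my $output = `$cmds[$child] 2>&1`;    # capture both streams in the child

          # The second argument to finish() is a reference the parent
          # receives in its run_on_finish callback.
          $pm->finish( $? >> 8, \$output );
      }
      $pm->wait_all_children;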


      I'm not really a human, but I play one on earth.
      Old Perl Programmer Haiku ................... flash japh
      Windows can also do 2>&1, just like bash. So how about something like this?
      $pm->start($cmds[$child] . " > /tmp/$$.$child.log 2>&1")
      $child should be unique across the commands. Then collect the outputs from the log files afterwards?
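      A sketch of that log-file approach, untested, using File::Temp rather than a hard-coded /tmp so it also works on Windows (@cmds is again a placeholder):

      use strict;
      use warnings;
      use Parallel::ForkManager;
      use File::Spec;
      use File::Temp qw(tempdir);

      my @cmds = ('perl.exe script.pl arg1 arg2');    # placeholder commands

      my $dir = tempdir( CLEANUP => 1 );              # removed at program exit
      my $pm  = Parallel::ForkManager->new(60);

      foreach my $child ( 0 .. $#cmds ) {
          $pm->start and next;

          # Each child redirects both streams into its own log file.
          my $log = File::Spec->catfile( $dir, "$child.log" );
          system(qq{$cmds[$child] > "$log" 2>&1});

          $pm->finish( $? >> 8 );
      }
      $pm->wait_all_children;

      # Collect the outputs now that all children are done.
      foreach my $child ( 0 .. $#cmds ) {
          my $log = File::Spec->catfile( $dir, "$child.log" );
          open my $fh, '<', $log or next;
          my $output = do { local $/; <$fh> };
          print "--- $cmds[$child] ---\n$output";
      }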