in reply to Read from a Linux pipe hangs.

By my understanding, reading from STDIN (or any filehandle) in list context will read everything until end of file is reached. That means the "consumer" program will wait on the @lines = <STDIN>; line until the producer program from which you are piping information finishes running (which producer.pl never does). I believe you can get around this problem using IO::Select. Something like the following:

while (1) {
    # This tells us if the processReader has data for us to read
    if (@ready = $select->can_read(0)) {
        # Read the data; if it fails, it's because we reached
        # the end of file.
        if (read($ready[0], $tmp, 128)) {
            $read_flag = 1;
        }
        # Add the output from the process to the variable
        $output .= $tmp;
        # We don't have any more to read, break out of the loop
        if (eof($ready[0])) {
            last;
        }
    } else {
        sleep(2);
    }
}
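
For reference, a minimal sketch of the setup the loop above assumes; the handle being watched is taken to be STDIN (the read end of the shell pipe), and the variable names are only placeholders:

use IO::Select;

# Watch STDIN for readable data.
my $select = IO::Select->new();
$select->add(\*STDIN);

# Working variables used by the read loop.
my ($output, $tmp, $read_flag) = ('', '', 0);
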
May the Force be with you

Re^2: Read from a Linux pipe hangs.
by dmor4477 (Initiate) on Feb 23, 2005 at 17:25 UTC
    I tried using IO::Select, adding STDIN:
    $select = IO::Select->new();
    $select->add(\*STDIN);
    
    But the 'if(@ready=$select->can_read(0))' is always false.
    Is there any method to read from a filehandle line by line instead of reading till the end of file?
    

      Is there any method to read from a filehandle line by line instead of reading till the end of file?

      To read from a filehandle a single line at a time, simply do so in scalar context.

      my $line = <STDIN>;

      will read a single line from STDIN. Note: a "line" is defined by the value of $/. This defaults to a newline, whose exact value depends on the operating system Perl is running under. If you wish to control the character used to determine the end of a "line", you can set $/ (however, this should be done in a limited scope, probably via local).
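
      A short sketch, assuming you want records terminated by a NUL byte rather than a newline (the separator here is only an example):

      {
          # Change the input record separator only within this block.
          local $/ = "\0";
          my $record = <STDIN>;   # reads up to and including the next NUL byte
      }
      # Outside the block, $/ reverts to its previous value.
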

      May the Force be with you