Stormwolf has asked for the wisdom of the Perl Monks concerning the following question:
#!/usr/bin/perl -w
use strict;
use IO::Select;
use IO::Handle;

my $selector;
my @ready;
my $fh;
my $line;

# pipe A: tail of a plain file I can write to for testing
open(A, "tail -0f file.1 2>&1 |");
A->autoflush(1);

# pipe B: tail of the quake console log
open(B, "tail -0f ~/Quake2/baseq2/qconsole.log 2>&1 |");
B->autoflush(1);

$selector = IO::Select->new();
$selector->add(*A, *B);

# block until either pipe has data, then read one line from each ready handle
while (@ready = $selector->can_read) {
    foreach $fh (@ready) {
        $line = <$fh>;
        if (fileno($fh) == fileno(A)) {
            print $line;
        } else {
            print $line;
        }
    }
}

close(B);
close(A);
All the tail command does is read from the Quake log (that's the console app). The other pipe simulates a second connection (to the IRC server). This system works... until a large amount of data arrives at once.
I wrote a script that appends one line at a time to file.1 (which is open as A). When I do this, the line prints in the main script's console, as it should. But if I spam the file (i.e. add lots of lines very quickly) and then stop suddenly, the file is updated, yet the main script only prints up to a certain point. If I then wait a second and add one more line to the A file, the main script outputs all the "stalled" lines along with the new one. Because the Quake server sometimes prints output like this (lots of lines quickly), I hit the same problem there as well.
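For reference, the test script just appends lines in a tight loop, roughly like this (a minimal sketch; the count of 500 is arbitrary, just enough to trigger the stall):

#!/usr/bin/perl -w
use strict;
use IO::Handle;

# "spam" test: append many lines to file.1 as fast as possible
open(my $out, ">>", "file.1") or die "can't append to file.1: $!";
$out->autoflush(1);
print $out "spam line $_\n" for 1 .. 500;   # arbitrary count
close($out);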
At first I thought this was a problem with the tail command, but after some testing (a plain tail with no perl involved stays up to date just fine) I realised that it's perl. From what I can see, when the file is spammed, the buffer builds up until things calm down, and even then it needs a little "nudge" to flush out.
What am I doing wrong, and how can I fix it?
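In case it helps, my current guess is that <$fh> slurps a whole buffer's worth from the pipe but returns only one line, so can_read (which watches the underlying file descriptor) never fires for the lines still sitting in perl's buffer. A sysread loop with hand-rolled line splitting would bypass that buffering layer entirely; here is a rough, untested sketch (same two pipes as above; the 4096 chunk size is arbitrary):

#!/usr/bin/perl -w
use strict;
use IO::Select;

open(A, "tail -0f file.1 2>&1 |")                       or die "pipe A: $!";
open(B, "tail -0f ~/Quake2/baseq2/qconsole.log 2>&1 |") or die "pipe B: $!";

my %pending;                              # per-handle partial-line buffers
my $selector = IO::Select->new(\*A, \*B);

while ($selector->count) {
    my @ready = $selector->can_read or last;   # block until a pipe has data
    foreach my $fh (@ready) {
        my $chunk;
        my $n = sysread($fh, $chunk, 4096);    # raw read, bypasses perl's line buffer
        if (!defined($n) or $n == 0) {         # read error or EOF: stop watching
            $selector->remove($fh);
            next;
        }
        $pending{fileno($fh)} .= $chunk;
        # emit every complete line; keep any trailing partial line for later
        while ($pending{fileno($fh)} =~ s/^([^\n]*\n)//) {
            print $1;
        }
    }
}

The idea is that sysread takes whatever is in the kernel's pipe buffer right now, so nothing can get stranded in a layer that select can't see.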
TIA,
- Stormwolf
Replies are listed 'Best First'.
Re: Reading multiple pipes
by matija (Priest) on Mar 29, 2004 at 20:45 UTC