hallikpapa has asked for the wisdom of the Perl Monks concerning the following question:

I have two scripts that connect to two different servers, and they stay connected permanently. Both run on the same machine, and each remote server runs a script that waits for the socket connection.
    CONNECT:
    $socket = IO::Socket::INET->new("$host:$port");
    unless ($socket) {    # Potential problem here
        $retry++;
        if ( $retry == 3 ) {
            log_die("Client : Error connecting to server ... $@\n");
        }
        log_notice("Client : Retry $retry : Sleeping $timeout Seconds\n");
        sleep($timeout);    # wait a minute
        goto CONNECT;
    }
    $retry     = 0;    # Reset in case we disconnect early
    $connected = 1;    # So that we know...
When the connection is made, it just sends:

    print $socket "tail\n";    # Rock n Roll
The server then starts streaming the live data to the client machine for other fun stuff. The reason it connects to two servers and tails both is that one is primary and one is failover, so live data is only written to one server at a time. This doesn't seem very efficient to me, and I'm seeking any wisdom from the monks on how I might combine this into one script that can tell which server is being written to, or even a module that handles situations like this flawlessly! Thanks for any pointers / suggestions.
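One way to do this without any extra modules is to multiplex both sockets in a single process with IO::Select, which ships with core Perl. This is only a sketch under my own assumptions: `tail_sockets` and the `primary`/`failover` names are mine, not from the original scripts, and it assumes the two sockets are already connected (e.g. with the IO::Socket::INET code above). Whichever socket actually delivers data tells you which server is currently being written to.

```perl
use strict;
use warnings;
use IO::Select;

# Read from several connected sockets in one loop.
# $socks   : hashref of { name => connected socket handle }
# $on_data : callback invoked as $on_data->($name, $bytes) for each read
sub tail_sockets {
    my ( $socks, $on_data ) = @_;
    my %name_of = map { fileno( $socks->{$_} ) => $_ } keys %$socks;
    my $sel = IO::Select->new( values %$socks );
    while ( $sel->count ) {
        for my $fh ( $sel->can_read ) {
            my $n = sysread( $fh, my $buf, 4096 );
            if ($n) {
                # This handle is the one the live data is flowing to.
                $on_data->( $name_of{ fileno $fh }, $buf );
            }
            else {    # EOF or read error: that server went away
                $sel->remove($fh);
                close $fh;
            }
        }
    }
}
```

You would call it with both already-connected sockets, e.g. `tail_sockets({ primary => $sock1, failover => $sock2 }, sub { ... })`, after printing "tail\n" to each. Note the callback receives raw chunks, not guaranteed whole lines, so buffer and split on "\n" yourself if you need line-at-a-time handling.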

Replies are listed 'Best First'.
Re: Tailing two files at once through IO::Socket
by cLive ;-) (Prior) on Jan 03, 2008 at 08:03 UTC
    You will probably want to look at POE and POE::Wheel::ReadWrite (which I've used and it does the job fine and will take you about 15 minutes to write :), or Danga::Socket, which I'm sure will also be good (though I've only used it obliquely).
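To make the POE suggestion concrete, here is a rough sketch of driving both sockets from one POE session with POE::Wheel::ReadWrite. The `tail_with_poe` helper and the `primary`/`failover` names are my own invention for illustration; it assumes the sockets are already connected and that POE's default line filter is acceptable (it hands you lines with the newline stripped).

```perl
use strict;
use warnings;
use POE qw(Wheel::ReadWrite);

# Tail every connected socket in { name => handle } inside one POE session;
# $on_line->($name, $line) fires for each complete line received.
sub tail_with_poe {
    my ( $socks, $on_line ) = @_;
    POE::Session->create(
        inline_states => {
            _start => sub {
                my $heap = $_[HEAP];
                for my $name ( keys %$socks ) {
                    my $wheel = POE::Wheel::ReadWrite->new(
                        Handle     => $socks->{$name},
                        InputEvent => 'got_line',
                        ErrorEvent => 'got_error',
                    );
                    # Remember which server each wheel ID belongs to.
                    $heap->{name_of}{ $wheel->ID } = $name;
                    $heap->{wheel}{ $wheel->ID }   = $wheel;
                }
            },
            got_line => sub {
                my ( $heap, $line, $wid ) = @_[ HEAP, ARG0, ARG1 ];
                $on_line->( $heap->{name_of}{$wid}, $line );
            },
            got_error => sub {    # EOF or read error: drop that wheel
                my ( $heap, $wid ) = @_[ HEAP, ARG3 ];
                delete $heap->{wheel}{$wid};
            },
        },
    );
    POE::Kernel->run;    # returns once every wheel is gone
}
```

The wheel ID passed to the input event is what lets one session tell the primary stream apart from the failover stream.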