gnu@perl has asked for the wisdom of the Perl Monks concerning the following question:
I am starting a program for managing and monitoring log files on various flavors of Unix/Linux. The objectives are as follows:
* View a log file, or just the last X lines of it.
* Rotate these files on demand instead of waiting for logrotate (some machines don't have it); a rough rotation sketch follows this list.
* Retrieve different files from different servers.
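For the rotation piece, one idea I am considering is copy-truncate, since the writing process then keeps its file handle on the same inode and nothing has to be restarted. A rough sketch of the command I would build and send over whatever remote connection is chosen (the path and timestamp format are just placeholders):

```perl
use POSIX qw(strftime);

# Copy-truncate keeps the writing process attached to the same inode,
# so no daemon needs to be restarted or HUPed after the rotation.
my $logfile = '/var/log/messages';                  # placeholder path
my $stamp   = strftime('%Y%m%d-%H%M%S', localtime);
my $rotate  = "cp $logfile $logfile.$stamp && cat /dev/null > $logfile";

# $rotate would be sent over the remote connection (see the Net::Telnet
# sketch further down) rather than run locally.
```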
These are the basic tasks I must perform; other 'spiffy' things will probably be added as time goes on.
What I am looking for is other ideas on how to get the data from the remote servers. I am limited in that I cannot run any sort of server process on the remote machines, but I will have both telnet and ftp access to them.
The first idea I had was to use open3 to telnet to the remote server and either cat or tail the logfile. This proved to be problematic. I then switched to using Net::Telnet. This worked fine in my original trials, but I am concerned about hangs. I have not yet tried ftp, simply because I do not want to bring the whole file over to the local system just to view its contents, but it is still an option.
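Roughly the kind of thing I have in mind with Net::Telnet is below (host, user, password and logfile are placeholders, and the prompt matching assumes a standard shell prompt); the Timeout and Errmode settings are my attempt to make a dead connection return an error instead of hanging the whole program:

```perl
use Net::Telnet;

# Placeholders -- swap in the real host, credentials and file.
my ($host, $user, $pass, $logfile) =
    ('web01', 'monitor', 'secret', '/var/log/messages');

my $t = Net::Telnet->new(
    Host    => $host,
    Timeout => 10,         # give up instead of hanging on a dead connection
    Errmode => 'return',   # methods return undef on failure rather than dying
);
die "connect to $host failed\n" unless $t;

$t->login($user, $pass)
    or die "login to $host failed: ", $t->errmsg, "\n";

# Fetch just the tail of the file; a per-command timeout keeps one stuck
# command from blocking everything else.  (tail -n may need to be plain
# "tail -50" on some older Unix flavors.)
my @lines = $t->cmd(String => "tail -n 50 $logfile", Timeout => 30);
print @lines;

$t->close;
```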
Essentially what I want to do is get the data from the remote server and store it in a hash, with servername:file as the key. I am aware that this could also be a problem if someone tries to pull over a 1GB file, but I am still working on the safety mechanisms for that. First I need to focus on how I am going to get the data at all before I make it 'safe' to do.
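A rough sketch of the hash layout I have in mind, with an arbitrary size cap standing in for the real safety mechanism (the server, file, credentials and 10MB limit are all placeholders):

```perl
use Net::Telnet;

my ($server, $file) = ('web01', '/var/log/messages');   # placeholders
my $t = Net::Telnet->new(Host => $server, Timeout => 10);
$t->login('monitor', 'secret');

my %logs;                                # "server:file" => arrayref of lines
my $key = "$server:$file";

# Check the size before pulling everything across the wire.
my ($size) = $t->cmd("wc -c < $file");
chomp $size;

if ($size > 10 * 1024 * 1024) {          # arbitrary 10MB cap for now
    # Too big to slurp whole -- grab only the tail as a stopgap.
    $logs{$key} = [ $t->cmd("tail -n 1000 $file") ];
} else {
    $logs{$key} = [ $t->cmd("cat $file") ];
}

$t->close;
```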
Any ideas on how I can handle this unique circumstance?
Thanks in advance, Chad.
Replies are listed 'Best First'.

Re: Viewing log files on remote servers.
by cLive ;-) (Prior) on Oct 08, 2002 at 20:15 UTC

Re: Viewing log files on remote servers.
by grinder (Bishop) on Oct 08, 2002 at 21:06 UTC
    by gnu@perl (Pilgrim) on Oct 08, 2002 at 21:26 UTC