This question is written at the prodding of the CB (hi
Zaxo!).
I'm tasked with developing something that will consolidate logfiles. Our environment is a diverse group of Windows 2000 (and one NT4) servers, of which only one, my application server, runs ActivePerl.
We have around two dozen IIS webservers, each with up to 8 different sites, hence up to 8 daily logfiles. In addition, we have several SMTP servers, and some other strange logfiles (ColdFusion, for one). These files do not all have a consistent naming scheme, location on server, or relationship between name/location and function.
I need to take all these daily files, optionally compress them, then process, rename, and move them to a single server, where they will eventually be moved to tape. My life is made somewhat easier by having the application server also be the logging server. I will be doing a pull from each server to the logging server.
Now, onto the real question. Originally, I had thought I would design a file format with one server / location / function per line, à la:
# my ugly flatfile
#
server1 D:\logfiles\w3svc3 application1
server1 D:\logfiles\w3svc4 application2
server2 E:\logfiles\w3svc3 application1
And then I could just my ($server, $location, $app) = split; each line. There are some other bits of information I will probably want to include.
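For concreteness, here is a minimal sketch of how I'd parse that flatfile with split — the sub name parse_sources and the hash keys are just my own invention:

```perl
use strict;
use warnings;

# Hypothetical sketch: take the flatfile's lines and return one
# hashref per server/location/app entry, skipping comments and blanks.
sub parse_sources {
    my @records;
    for (@_) {
        chomp;
        next if /^\s*#/ or /^\s*$/;    # skip comment and blank lines
        my ($server, $location, $app) = split;
        push @records, { server => $server, location => $location, app => $app };
    }
    return @records;
}

my @recs = parse_sources(
    "# my ugly flatfile\n",
    "server1 D:\\logfiles\\w3svc3 application1\n",
);
print "$recs[0]{server} => $recs[0]{app}\n";    # server1 => application1
```

In practice the lines would come from the config file on disk rather than a literal list, of course.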
But I am wondering: should I make the file an XML format and use XML::Parser or the like? What advantages and disadvantages would that have? I am an XML n00b and unfamiliar with its correct applications.
Will an XML format help me if I want to add a new data bit, like "compress" or "strip images if HTTP logfile"?
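To make the question concrete, here is a guess at what the XML version might look like, parsed with XML::Simple (a convenience wrapper that sits on top of XML::Parser). The element and attribute names are entirely my own invention — the point is that a new attribute like compress just shows up as another hash key, and old code that ignores it keeps working:

```perl
use strict;
use warnings;
use XML::Simple;    # convenience wrapper over XML::Parser

# Hypothetical XML equivalent of the flatfile above.
my $xml = <<'END_XML';
<logsources>
  <source server="server1" location="D:\logfiles\w3svc3"
          app="application1" compress="1" />
  <source server="server1" location="D:\logfiles\w3svc4"
          app="application2" />
</logsources>
END_XML

# ForceArray so a single <source> still comes back as a list.
my $cfg = XMLin($xml, ForceArray => ['source']);
for my $src (@{ $cfg->{source} }) {
    print "$src->{server}: $src->{location}",
          ($src->{compress} ? " (compress)" : ""), "\n";
}
```

Is that roughly the shape such a config would take, or am I misusing XML here?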
Thanks be to you, Oh Hallowed Masters of all that it is Perl-fu...
<-> In general, we find that those who disparage a given operating system, language, or philosophy have never had to use it in practice. <->