thelamb has asked for the wisdom of the Perl Monks concerning the following question:

I am trying to parse a lot of game server log files to get some relevant info (new connections, player kicks, etc.). I have no experience in Perl whatsoever, so I could use a little help.

My questions are:
How do I 'import' the regular expressions I want to search for from a file? Let's say I have 20 regexps I search for, and they are in a file 'searchlist'.

----
The file structure where the logs are is:
~/logs/serverIP_PORT/date.log
e.g.: ~/logs/1-1-1-1-2222/20080917.log

Is there a way to search all .log files with a specific date, regardless of the serverIP_PORT folder they are in? So instead of looping through all the serverIP_PORT folders, do it in one go.

----
I need to add some information in front of every line that contains one of the regexps. For example, when there is a new connection:
[2008-01-01 11:00] New Connection (slot #1) name IP
The Perl script will find this (it searches for New\sConnection), but it needs to add the serverIP in front of this line (it gets the serverIP from the folder the .log file is in, see the previous question).
Does this limit me to line-by-line searching of every logfile (rather than something like: perl -ne 'print if (m/(New\sConnection)/i)' test.log)?

I'm sorry if I'm asking too much, but my time is rather limited here, so learning the ins and outs of Perl isn't really an option at the moment. If anyone could push me in the right direction (especially for the first and last question ^^) I'd be very grateful.

Re: Log parsing question
by moritz (Cardinal) on Sep 17, 2008 at 11:26 UTC
    Globs can help you with your file name problem: perl -e '....' ~/logs/*/20080917.log

    The file name that is currently being processed is then stored in $ARGV, which should solve your second problem.
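    For example, a minimal sketch along those lines (assuming the directory layout from the question, so that the path component just above the file name is the serverIP_PORT folder) could be:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # invoke as: ./scan.pl ~/logs/*/20080917.log
    while (my $line = <>) {
        next unless $line =~ /New\sConnection/i;

        # $ARGV holds the name of the file currently being read,
        # e.g. /home/you/logs/1-1-1-1-2222/20080917.log
        my ($server) = $ARGV =~ m{([^/]+)/[^/]+$};   # the serverIP_PORT folder
        print "$server $line";
    }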

Re: Log parsing question
by eighty-one (Curate) on Sep 17, 2008 at 13:25 UTC
    This pertains to the second item you bring up. I'm not 100% sure why you need to 'do it all in one go' as you say, so this may or may not be helpful. But, adapted from a node by an anonymous monk posted here, you can generate a list of directories within a specified directory.

    You can use this to make your script adapt in case the number of or name of directories varies from run to run.

    This:
    #!/usr/bin/perl -w
    use Data::Dumper;

    while (<*>) {
        push(@files, $_) if (-d "$_");
    }
    print "\n" . Dumper(@files) . "\n";
    will produce output like this:
    sean@seanc:~/code/temp$ ls -l
    total 16
    drwxr-xr-x 2 sean sean 4096 2008-09-17 09:12 dir1
    drwxr-xr-x 2 sean sean 4096 2008-09-17 09:12 dir2
    drwxr-xr-x 2 sean sean 4096 2008-09-17 09:12 dir3
    -rwxr-xr-x 1 sean sean  155 2008-09-17 09:11 dir_list.pl
    -rw-r--r-- 1 sean sean    0 2008-09-17 09:12 file1
    -rw-r--r-- 1 sean sean    0 2008-09-17 09:12 file2
    -rw-r--r-- 1 sean sean    0 2008-09-17 09:12 file3
    sean@seanc:~/code/temp$ ./dir_list.pl
    $VAR1 = 'dir1';
    $VAR2 = 'dir2';
    $VAR3 = 'dir3';
    so you can always have an up-to-date list of subdirectories, and the code looks prettier than having a big, long hard-coded list.

    Changing the '-d' to a '-f' would give you a list of files, so you could adapt that to get a list of directories, then get a list of filenames or check for the presence of a particular file.
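    Putting those two tests together, a rough sketch (assuming the ~/logs layout from the question and a hard-coded date, just for illustration) that collects the matching log file from every server directory might look like:

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $logdir = "$ENV{HOME}/logs";   # assumed location of the log tree
    my $date   = '20080917';          # assumed date of interest

    my @logfiles;
    for my $dir (glob "$logdir/*") {
        next unless -d $dir;                   # only the serverIP_PORT directories
        my $file = "$dir/$date.log";
        push @logfiles, $file if -f $file;     # only servers that logged that day
    }

    print "$_\n" for @logfiles;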
Re: Log parsing question
by apl (Monsignor) on Sep 17, 2008 at 11:44 UTC
    How do I 'import' the regular expressions I want to search for from a file?
    Read them into an array.
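    A minimal sketch of that (assuming one pattern per line in a plain-text file called 'searchlist', as in the question) might be:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # read one pattern per line and precompile each with qr//
    open my $fh, '<', 'searchlist' or die "Cannot open searchlist: $!";
    chomp(my @patterns = <$fh>);
    close $fh;
    my @regexps = map { qr/$_/i } grep { length } @patterns;

    # then test every log line against the whole list
    while (my $line = <>) {
        for my $re (@regexps) {
            if ($line =~ $re) {
                print $line;
                last;               # one match per line is enough
            }
        }
    }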