in reply to Re: Searching a gzip file
in thread Searching a gzip file

Hi Monks, I need suggestions on the fastest way to do this. I have three directories with log files in them, and all of the log files follow this pattern:

<block>
  <id>xyz</id>
  <url>foo.com</url>
  ..
  <response>xyz</response>
</block>
<block>
..

The task: from the logs in the first directory, get the id of every block whose url is foo.com; then search for each of those ids in all the directories (including the first one) and print the responses from the corresponding blocks to a separate file.
So far I have this for collecting the ids from the first directory:

#!/usr/bin/perl
use strict;
use warnings;
use IO::File;

my (@logdirs, @logfiles);   # dir paths and per-dir file lists, populated elsewhere
my @IDs;                    # ids collected from the first directory

# First pass: collect the id of every block in the first directory
# whose url matches foo.com. The blocks found here can also serve as
# the source for the responses from the first directory; something
# similar is needed for the rest of the directories.
sub doFile {
    my ($fn) = @_;
    chomp $fn;
    print "opening $fn\n";
    my $fh = IO::File->new($fn, 'r')
        or die "Cannot open $fn: $!\n";
    my @msgLines;
    while (my $l = <$fh>) {
        push @msgLines, $l;
        if ($l =~ m{</block>\s*$}) {          # end of one log block
            if (grep { m{http://.*foo\.com} } @msgLines) {
                # take the captured id; a bare grep would only return a count
                my ($id) = map { m{<id>(\d+)</id>} ? $1 : () } @msgLines;
                push @IDs, $id if defined $id;
            }
            @msgLines = ();
        }
    }
}

my @firstdir = @{ $logfiles[0] };
my $path     = $logdirs[0];
foreach my $file (@firstdir) {
    my $curpath = "$path/$file";
    print "In foreach trying to open $curpath\n";
    doFile($curpath);
}
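For the second pass (searching every directory for those ids and saving the responses), here is a minimal sketch of what I have in mind, reusing @IDs, @logdirs and @logfiles from the code above; the output filename responses.txt is just a placeholder:

my %wanted = map { $_ => 1 } @IDs;   # hash lookup beats rescanning @IDs per block
open my $out, '>', 'responses.txt' or die "Cannot open responses.txt: $!\n";

for my $d (0 .. $#logdirs) {
    for my $file (@{ $logfiles[$d] }) {
        my $fh = IO::File->new("$logdirs[$d]/$file", 'r')
            or die "Cannot open $logdirs[$d]/$file: $!\n";
        my @blockLines;
        while (my $l = <$fh>) {
            push @blockLines, $l;
            if ($l =~ m{</block>\s*$}) {
                my ($id) = map { m{<id>(\d+)</id>} ? $1 : () } @blockLines;
                if (defined $id && $wanted{$id}) {
                    # keep only the response lines of matching blocks
                    print {$out} grep { m{<response>} } @blockLines;
                }
                @blockLines = ();
            }
        }
    }
}
close $out;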
The log files are huge, so zipping them into a single file is not possible (I am out of disk space). Are there any Perl modules that can help me with this task?
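One thing I have been looking at: if the logs are stored gzipped (as the thread title suggests), the core module IO::Uncompress::Gunzip can stream a .gz file line by line without writing a decompressed copy to disk, so doFile() could read the compressed logs directly. A minimal sketch; the filename logs/a.log.gz is made up:

use IO::Uncompress::Gunzip qw($GunzipError);

# Stream a gzipped log line by line; no decompressed copy hits the disk.
my $z = IO::Uncompress::Gunzip->new('logs/a.log.gz')
    or die "gunzip failed: $GunzipError\n";
while (my $l = <$z>) {
    # ... same block-accumulating logic as in doFile() above ...
}
close $z;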