in reply to Break up weblogs

In the first case you are processing the log 40+ times.
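For illustration, that first approach presumably looks something like the loop below, with one full pass over the log per department (untested; the paths and dept ids are placeholders):

use strict;
use warnings;

my @deptids = qw[ dept1 dept2 dept3 ];   ## placeholder ids

for my $dept ( @deptids ) {
    ## Every department forces a complete re-read of the log.
    open my $in,  '<', '/data/logs/access.log'          or die $!;
    open my $out, '>', "/data/logs/${dept}current.log"  or die "$dept: $!";
    while ( my $line = <$in> ) {
        print $out $line if $line =~ m[\Q$dept];
    }
    close $out;
    close $in;
}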

In the second case you are accumulating a *lot* of data in memory.
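Again for illustration, the second approach presumably reads the log once but holds every matched line in a hash of arrays until the end, something like (also untested, same placeholders):

use strict;
use warnings;

my @deptids = qw[ dept1 dept2 dept3 ];   ## placeholder ids

my %lines;   ## every matched line sits here until the log is exhausted
open my $in, '<', '/data/logs/access.log' or die $!;
while ( my $line = <$in> ) {
    for my $dept ( @deptids ) {
        if ( $line =~ m[\Q$dept] ) {
            push @{ $lines{ $dept } }, $line;
            last;
        }
    }
}
close $in;

for my $dept ( keys %lines ) {
    open my $out, '>', "/data/logs/${dept}current.log" or die "$dept: $!";
    print $out @{ $lines{ $dept } };
    close $out;
}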

The third option is to open all 40 output files up front and process the log just once, writing each line to the appropriate file as it is matched. Something like this (untested):

#! perl -slw
use strict;

my @deptids = qw[ dept1 dept2 dept3 ];

## Open one output file per department up front.
my %fh;
open $fh{ $_ }, '+>', "/data/logs/${_}current.log" or die "$_: $!"
    for @deptids;

open LOGFILE, '<', '/data/logs/access.log' or die $!;

while( defined( my $line = <LOGFILE> ) ) {
    chomp $line;    ## -l on the shebang sets $\, so print adds the newline back

    ## First department id found in the line wins.
    my $match;
    for ( @deptids ) {
        $match = $_ and last if $line =~ m[\Q$_];
    }

    if( $match ) {
        ## Note: The {}s around the file handle are required.
        print { $fh{ $match } } $line;
    }
    else {
        print STDERR "'$line' didn't match any dept";
    }
}

close $_ for values %fh;
close LOGFILE;
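One design note: '+>' opens each department file read-write and truncates it on every run. If the *current.log files are meant to accumulate across runs, '>>' (append) is the better choice. Forty open filehandles is also well within the per-process limit on any reasonable system, so the single-pass approach costs nothing there.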

Examine what is said, not who speaks.
"Efficiency is intelligent laziness." -David Dunham
"Think for yourself!" - Abigail
"Memory, processor, disk in that order on the hardware side. Algorithm, algorithm, algorithm on the code side." - tachyon