in reply to Re: Dealing with huge log files 100 MB to 1 GB
in thread Dealing with huge log files 100 MB to 1 GB

To further the above: think database. The workflow you want is "read/compute/store", then "query/display". (This assumes you don't want exactly the same statistics every day; if you do, just generate a static web page with them when the log files roll over.)

Databases handle massive amounts of data quite well: I deal with multiple logs of about 1 GB compressed every day, and once they are parsed and loaded into the database, queries return nearly instantly.

And, luckily enough, Perl has good database support for just about any database you'd ever want to use (and a few you wouldn't).
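As a minimal sketch of the read/compute/store then query/display workflow, here is one way it might look with DBI and DBD::SQLite (an in-memory database and a made-up three-field log format, purely for illustration; a real log parser and an on-disk database would replace both):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# In-memory database for illustration; point this at a file in practice.
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
    { RaiseError => 1, AutoCommit => 0 });

$dbh->do(q{
    CREATE TABLE hits (
        host   TEXT,
        status INTEGER,
        bytes  INTEGER
    )
});

# Read/compute/store: parse each log line once, insert into the database.
# Hypothetical log format: "host status bytes" per line.
my @log_lines = (
    '10.0.0.1 200 5120',
    '10.0.0.2 500 312',
    '10.0.0.1 404 128',
);
my $ins = $dbh->prepare(
    'INSERT INTO hits (host, status, bytes) VALUES (?, ?, ?)');
for my $line (@log_lines) {
    my ($host, $status, $bytes) = split ' ', $line;
    $ins->execute($host, $status, $bytes);
}
$dbh->commit;    # batch the inserts in one transaction for speed

# Query/display: the statistics come straight back from the database.
my ($errors) = $dbh->selectrow_array(
    'SELECT COUNT(*) FROM hits WHERE status >= 500');
print "server errors: $errors\n";
```

Committing once per batch (rather than per insert) is what keeps the load step fast on gigabyte-sized logs; after that, every report is just a SELECT.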
