in reply to Dealing with huge log files 100 MB to 1 GB

Pre-compute the values, and then access the computed values from the web interface.

Reading 1GB of log files from disk is already slow, regardless of how fast Perl processes the data - the job is likely IO bound. So the solution must be to avoid reading the whole log file in response to an action from the web interface.
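
For illustration, here is a minimal sketch of the pre-computation step, assuming an Apache-style access log and a made-up summary file path; the point is that the log is read once, line by line, and the web interface only ever reads the small summary:

    #!/usr/bin/perl
    use strict;
    use warnings;

    my %hits_per_hour;

    # Read the log once, line by line - never slurp 1GB into memory.
    open my $log, '<', '/var/log/httpd/access.log' or die "open log: $!";
    while (my $line = <$log>) {
        # Assumes common-log-format timestamps like [17/May/2010:15:24:01 +0000];
        # capturing up to the hour aggregates hits per hour.
        $hits_per_hour{$1}++ if $line =~ m{\[(\d+/\w+/\d+:\d+)};
    }
    close $log;

    # Store the computed values; the web interface reads only this file.
    open my $out, '>', '/var/cache/log_summary.txt' or die "open summary: $!";
    print {$out} "$_\t$hits_per_hour{$_}\n" for sort keys %hits_per_hour;
    close $out;

Run that from cron when the logs roll over: one sequential pass is as fast as the disk allows, and the web interface stays fast no matter how big the logs get.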

Perl 6 - links to (nearly) everything that is Perl 6.

Re^2: Dealing with huge log files 100 MB to 1 GB
by DStaal (Chaplain) on May 17, 2010 at 15:24 UTC

    To further the above: Think database. The work process you want is 'Read/compute/store', then 'query/display'; a sketch follows at the end of this node. (This assumes you don't want exactly the same statistics every day; if you do, just generate a static web page when the log files roll over.)

    Databases work with massive amounts of data quite well: I deal with multiple logs that are about 1GB compressed every day, and once they are parsed and in the database, data return is nearly instant.

    And, luckily enough, Perl has good database support for just about any database you'd ever want to use (and a few you wouldn't).
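
    A minimal sketch of both halves, assuming DBD::SQLite and a made-up hourly_hits table; with DBI, the query half looks the same for just about any backend:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use DBI;

        # Stand-in for the per-hour counts parsed out of the logs.
        my %hits_per_hour = ('17/May/2010:15' => 12345);

        my $dbh = DBI->connect('dbi:SQLite:dbname=logstats.db', '', '',
                               { RaiseError => 1, AutoCommit => 0 });

        # Read/compute/store: batch the inserts in one transaction for speed.
        $dbh->do('CREATE TABLE IF NOT EXISTS hourly_hits
                  (hour TEXT PRIMARY KEY, hits INTEGER)');
        my $ins = $dbh->prepare(
            'INSERT OR REPLACE INTO hourly_hits (hour, hits) VALUES (?, ?)');
        $ins->execute($_, $hits_per_hour{$_}) for sort keys %hits_per_hour;
        $dbh->commit;

        # Query/display: the web interface runs cheap indexed lookups like this.
        my ($hits) = $dbh->selectrow_array(
            'SELECT hits FROM hourly_hits WHERE hour = ?',
            undef, '17/May/2010:15');
        print "hits: $hits\n";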