in reply to Interlaced log parser

As someone who recently had to solve a similar problem at work (slightly different, in that I didn't have to do it in real time): reading the log over and over is going to be slow. Probably slower than you want to deal with.

To actually get decent speed, I eventually moved the assembly of transactions into the database: I insert each line as I read it, letting each record fill in as information arrives. (Note that you need some way of knowing which record each line belongs to, and of identifying that record uniquely over at least the course of a single log file. Depending on the format of the logs, it may be worth tracking only the 'active' transactions.) There's a rough sketch of the idea below.
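Something along these lines (a sketch only; the log format, table, and column names are invented for illustration, and I'm using SQLite as a stand-in for whatever DBD driver you actually have working):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Scratch database; any DBD driver works the same way.
    my $dbh = DBI->connect('dbi:SQLite:dbname=txns.db', '', '',
                           { RaiseError => 1, AutoCommit => 0 });

    # One row per transaction; columns fill in as lines arrive.
    $dbh->do(q{
        CREATE TABLE IF NOT EXISTS txn (
            txn_id  TEXT PRIMARY KEY,
            status  TEXT
        )
    });

    # Create the record on first sight of an ID, then update
    # columns as later lines supply more information.
    my $ins = $dbh->prepare(
        'INSERT OR IGNORE INTO txn (txn_id) VALUES (?)');
    my $upd = $dbh->prepare(
        'UPDATE txn SET status = ? WHERE txn_id = ?');

    while (my $line = <>) {
        # Hypothetical line format: "... [TXN123] status=DONE ..."
        next unless $line =~ /\[(\w+)\]\s+status=(\w+)/;
        my ($id, $status) = ($1, $2);
        $ins->execute($id);           # make sure the record exists
        $upd->execute($status, $id);  # fill in what this line tells us
    }

    $dbh->commit;

Turning off AutoCommit and committing in batches, rather than once per line, makes a big difference to insert throughput.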

Doing this allowed me to process 1-1.5GB of data a day in a couple of hours. (Despite having to use a temp database, since the DBD install for the main database is borked beyond repair.)