in reply to is this the most efficient way to double-parse a large file?
How big is your "large logfile"? If it's less than half the memory in your computer, then simply keep all the client data in a hash: parse the file once, then print the report from the triggered client's data at the end.
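A minimal sketch of that single-pass, in-memory approach. The tab-separated "client<TAB>entry" format, the sample data, and the `clientB` trigger are assumptions for illustration; adjust the `split` to match your real log format.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Assumed sample log: one "client<TAB>entry" record per line.
my $log = "clientA\tlogin ok\n"
        . "clientB\tlogin ok\n"
        . "clientB\tupload 3 files\n"
        . "clientA\tlogout\n";
open my $logFh, '<', \$log or die "Can't open log: $!";

# Hash of arrays: one list of entries per client, built in a single pass.
my %entriesByClient;
while (my $line = <$logFh>) {
    chomp $line;
    my ($client, $entry) = split /\t/, $line, 2;
    push @{$entriesByClient{$client}}, $entry;
}
close $logFh;

# One parse done; report using just the triggered client's data.
my $triggeredClient = 'clientB';    # assumed trigger
print "$_\n" for @{$entriesByClient{$triggeredClient} || []};
```

The in-memory filehandle (`open ... '<', \$log`) just stands in for your real logfile; in practice you'd open the file by name and the memory cost is roughly the total size of all entries plus hash overhead.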
If the file is too big for the memory-based approach but memory size > (20 * clients * client entries), then hdb's solution should be fine. Otherwise, either use a database as hdb suggested, or parse the logfile once but write each client's data out to its own file in the report format you need, then process the triggered client's file after parsing your logfile. Note that you need to take care not to open too many file handles with this last approach!
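A sketch of that one-file-per-client variant. The tab-separated format, the sample data, and the file naming are assumptions. Handles are cached in a hash here; with very many clients you'd have to close the least recently used handles and reopen them in append mode to stay under the OS per-process file handle limit.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Assumed sample log, standing in for the real (huge) logfile.
my $log = "clientA\tlogin ok\n"
        . "clientB\tlogin ok\n"
        . "clientB\tupload 3 files\n"
        . "clientA\tlogout\n";
open my $logFh, '<', \$log or die "Can't open log: $!";

my $dir = tempdir(CLEANUP => 1);    # per-client files live here
my %fhByClient;                     # cache of open output handles

while (my $line = <$logFh>) {
    my ($client) = $line =~ /^(\S+)\t/ or next;
    if (!$fhByClient{$client}) {
        open $fhByClient{$client}, '>', "$dir/$client.log"
            or die "Can't write $dir/$client.log: $!";
    }
    print {$fhByClient{$client}} $line;
}
close $_ for values %fhByClient;
close $logFh;

# Only the triggered client's (much smaller) file needs processing now.
my $triggeredClient = 'clientB';    # assumed trigger
open my $fh, '<', "$dir/$triggeredClient.log"
    or die "Can't read $dir/$triggeredClient.log: $!";
print while <$fh>;
close $fh;
```

Memory use stays flat regardless of logfile size, at the cost of extra disk I/O and the handle-count bookkeeping mentioned above.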
Re^2: is this the most efficient way to double-parse a large file?
by jasonl (Acolyte) on Jan 21, 2014 at 16:21 UTC
by GrandFather (Saint) on Jan 21, 2014 at 22:03 UTC