loris has asked for the wisdom of the Perl Monks concerning the following question:
Hello Knowledgeable Ones,
I have around 40 logfiles of about 15 MB each. Around 30 processes write willy-nilly into these files, and each line contains text that identifies the process which wrote it. I would like to untangle the log files to produce a single file for each process.
Naively, I could slurp all the logfiles, use grep or split to get the process ID from each line, and then write each line to the appropriate new log file. However, I suspect that slurping all the data would give me memory issues, so I would like to know what a more scalable approach would be.
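For concreteness, here is the sort of line-by-line sketch I had in mind instead of slurping (the '[pid NNNN]' marker, the logs/ directory and the output file names are just made-up placeholders for whatever the real log format turns out to be):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Sketch: read each logfile line by line and demultiplex by process ID.
    # Assumes a hypothetical "[pid 1234]" marker on each line -- adjust the
    # regex and the paths to the real format.
    my %out;    # cache of output filehandles, one per process ID

    for my $logfile (glob 'logs/*.log') {
        open my $in, '<', $logfile or die "Cannot open $logfile: $!";
        while (my $line = <$in>) {
            my ($pid) = $line =~ /\[pid\s+(\d+)\]/ or next;  # skip lines with no ID
            unless ($out{$pid}) {
                open $out{$pid}, '>>', "process_$pid.log"
                    or die "Cannot open process_$pid.log: $!";
            }
            print { $out{$pid} } $line;
        }
        close $in;
    }
    close $_ for values %out;

Since there are only around 30 processes, keeping one output filehandle open per process should stay well within the usual open-file limits.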
Any ideas?
Thanks,
Replies are listed 'Best First'.
Re: Untangling Log Files
  by davorg (Chancellor) on Feb 08, 2007 at 12:14 UTC
  by sauoq (Abbot) on Feb 08, 2007 at 16:03 UTC
  by davorg (Chancellor) on Feb 08, 2007 at 16:08 UTC
Re: Untangling Log Files
  by jettero (Monsignor) on Feb 08, 2007 at 12:11 UTC
  by Util (Priest) on Feb 08, 2007 at 15:53 UTC
  by loris (Hermit) on Feb 09, 2007 at 07:23 UTC
Re: Untangling Log Files
  by Moron (Curate) on Feb 08, 2007 at 12:47 UTC
Re: Untangling Log Files
  by kwaping (Priest) on Feb 08, 2007 at 17:36 UTC
  by loris (Hermit) on Feb 09, 2007 at 07:19 UTC
  by kwaping (Priest) on Feb 09, 2007 at 12:58 UTC