in reply to How to merge Huge log files (each 10 MB) into a single file

Anonymous Monk's advice is probably the best solution, but if you want to speed up a Perl solution, it may be faster to slurp each file whole instead of reading it line by line. You can do this by setting $/ to undef, or by using File::Slurp.
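For example, a minimal sketch of the slurp approach (file names here are just placeholders):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Append each log file to the output in a single read per file.
my @logs = glob 'app*.log';    # example pattern, adjust to taste

open my $out, '>', 'merged.log' or die "merged.log: $!";
for my $log (@logs) {
    open my $in, '<', $log or die "$log: $!";
    local $/;                   # slurp mode: read the whole file at once
    print {$out} scalar <$in>;
    close $in;
}
close $out;
```

The `local $/;` confines slurp mode to the loop body, so $/ reverts to its normal value afterwards.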

That being said, wouldn't it make more sense to actually merge the files based on the timestamps of each log entry?
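A sketch of that timestamp-based merge, assuming every log line starts with a sortable timestamp (e.g. "2009-09-03 14:23:05 ...") so a plain string sort yields chronological order; the file names are again placeholders:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Collect every line from every log, then sort them by their leading
# timestamp. At ~10 MB per file this fits comfortably in memory.
my @lines;
for my $log (glob 'app*.log') {
    open my $in, '<', $log or die "$log: $!";
    push @lines, <$in>;
    close $in;
}

open my $out, '>', 'merged-sorted.log' or die "merged-sorted.log: $!";
print {$out} sort @lines;
close $out;
```

If the timestamps are not lexically sortable (say, "03/09/2009"), you would need a Schwartzian transform to sort on a normalized key instead.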

Replies are listed 'Best First'.
Re^2: How to merge Huge log files (each 10 MB) into a single file
by lnin (Initiate) on Sep 03, 2009 at 14:23 UTC

    I am working on the Windows platform, so I am not able to try the cat command

      cat works fine on Windows too, provided you install it, say, as part of Cygwin. But you can also do

      copy /b FILE1+FILE2+FILE3+FILE4 bigone
      Under Windows, you would use type instead of cat: type file1 file2 file*.* > output.txt