in reply to Performance Question
Jumping ahead to a solution: I would probably slice the monster file into pieces (there are lots of ways to do that), then process a couple of pieces in parallel. The way I would test that would be to take a 1G slice of the file, pretend that's the big file, and try various piece counts.
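For what it's worth, here's a minimal (untested) C sketch of the fork-one-worker-per-piece idea, assuming line-oriented data; each worker seeks to its own byte range and aligns itself to a line boundary so no line gets processed twice or dropped. The per-line work is a placeholder:

    #define _FILE_OFFSET_BITS 64        /* so off_t handles files over 2G */
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/stat.h>
    #include <sys/wait.h>
    #include <unistd.h>

    /* Each worker owns exactly the lines that start in [begin, end). */
    static void worker(const char *path, off_t begin, off_t end, int id)
    {
        FILE *f = fopen(path, "r");
        if (!f) { perror("fopen"); _exit(1); }

        char line[8192];
        if (begin > 0) {
            /* Peek at the byte just before our range: if it isn't a
               newline, we're mid-line and the previous worker owns it. */
            fseeko(f, begin - 1, SEEK_SET);
            if (fgetc(f) != '\n')
                fgets(line, sizeof line, f);
        }

        long lines = 0;
        while (ftello(f) < end && fgets(line, sizeof line, f))
            lines++;                    /* real per-line work goes here */

        printf("worker %d: %ld lines\n", id, lines);
        fclose(f);
        _exit(0);
    }

    int main(int argc, char **argv)
    {
        if (argc != 3) {
            fprintf(stderr, "usage: %s file nworkers\n", argv[0]);
            return 1;
        }
        int n = atoi(argv[2]);
        if (n < 1) { fprintf(stderr, "need at least one worker\n"); return 1; }

        struct stat st;
        if (stat(argv[1], &st) != 0) { perror("stat"); return 1; }

        off_t size = st.st_size, piece = size / n;
        for (int i = 0; i < n; i++) {
            off_t begin = i * piece;
            off_t end   = (i == n - 1) ? size : begin + piece;
            if (fork() == 0)
                worker(argv[1], begin, end, i);
        }
        while (wait(NULL) > 0)          /* reap all the workers */
            ;
        return 0;
    }

Note that this never actually splits the file on disk: every worker opens the same file and just stays inside its own byte range.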
Failing that, write a program in C (something I've done many times) to suck the file in 64K chunks at a time (or whatever size chunks your system can manage), then process the lines individually. The processed lines go into a 64K output buffer, and when that buffer fills up, you write it out to the output file. Piece of cake. :) And you should get great performance doing it in C, better than Perl.
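Something like this sketch, for instance; rather than hand-rolling the chunk loop, it leans on stdio's buffering (setvbuf with 64K buffers) to get the same I/O pattern, and process_line() is just a placeholder for the real per-line work:

    #include <stdio.h>

    #define CHUNK 65536                 /* 64K, as above */

    /* Placeholder: the real per-line transformation goes here. */
    static void process_line(char *line)
    {
        (void)line;
    }

    int main(int argc, char **argv)
    {
        if (argc != 3) {
            fprintf(stderr, "usage: %s infile outfile\n", argv[0]);
            return 1;
        }
        FILE *in  = fopen(argv[1], "r");
        FILE *out = fopen(argv[2], "w");
        if (!in || !out) { perror("fopen"); return 1; }

        /* 64K buffers: reads arrive in 64K chunks, and output
           accumulates until a 64K block is ready to be written. */
        static char inbuf[CHUNK], outbuf[CHUNK];
        setvbuf(in,  inbuf,  _IOFBF, CHUNK);
        setvbuf(out, outbuf, _IOFBF, CHUNK);

        char line[8192];                /* assumes lines fit in 8K */
        while (fgets(line, sizeof line, in)) {
            process_line(line);
            fputs(line, out);
        }
        fclose(out);
        fclose(in);
        return 0;
    }

If you really want raw read()/write() with a hand-rolled buffer you can do that too, but then you have to deal with lines that straddle chunk boundaries yourself; setvbuf buys you the same I/O pattern with a lot less code.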
--t. alex
"Nyahhh (munch, munch) What's up, Doc?" --Bugs Bunny
Replies are listed 'Best First'.
Re: Re: Performance Question
  by BUU (Prior) on May 08, 2002 at 13:57 UTC
  by Elian (Parson) on May 08, 2002 at 14:02 UTC
  by talexb (Chancellor) on May 08, 2002 at 14:13 UTC