"If your 12 million records average less than a couple of kbytes each (ie. if the size of the records file is less than your available memory)"

Is 12GB a normal amount of memory for a single process to use these days? My sense was that 4GB was standard on an entry-level desktop or a mid-level laptop. Even if you have a super-machine with 16GB, you may not want a single process to suck all of that up just to run an O(n^2) program. A hash containing the smaller file, or two on-disk sorts, would be a much better option, and not that hard to do.
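For the hash approach, something along these lines would do it. A minimal sketch, not tested against the OP's data; the file names (keys.txt, records.txt) are made up for illustration, and it assumes the smaller file is one key per line and the key is the first whitespace-separated field of each record:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical file names:
    #   keys.txt    - the smaller file, one key per line
    #   records.txt - the 12-million-record file, key in the first field

    # Load the smaller file into a hash: memory proportional to the
    # small file only, with O(1) lookups per record.
    my %want;
    open my $keys, '<', 'keys.txt' or die "keys.txt: $!";
    while (my $k = <$keys>) {
        chomp $k;
        $want{$k} = 1;
    }
    close $keys;

    # Stream the big file once; it never has to fit in memory.
    open my $recs, '<', 'records.txt' or die "records.txt: $!";
    while (my $line = <$recs>) {
        my ($key) = split ' ', $line, 2;
        print $line if exists $want{$key};
    }
    close $recs;

That turns the O(n^2) scan into one pass over each file, and memory use is bounded by the smaller file rather than by all 12 million records.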
In reply to Re^2: Faster grep in a huge file(10 million) by educated_foo, in thread Faster grep in a huge file(10 million) by Thomas Kennll