A file totalling 350MB can simply be held in memory and be done with. (I know that you have recently dealt with files several orders of magnitude larger.)
Slurped into a scalar, okay. But for the OP's purpose he would need to build a hash from it, and that would require 52.5GB of RAM.
Not impossible, for sure, but it would (still, currently) take a machine a cut (or two) above the average commodity box, many of whose motherboards are still limited to 16 or 32GB.
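To make the memory arithmetic concrete, here is a minimal sketch of the kind of index hash under discussion: one entry per input line, mapping line content to byte offset. The record format and key choice are assumptions, and the "file" is simulated in memory so the snippet is self-contained:

```perl
use strict;
use warnings;

# Simulate a file of short records (assumed format) so the snippet runs as-is.
my @lines = map { "record_$_" } 1 .. 100_000;

my %index;
my $offset = 0;
for my $line (@lines) {
    $index{$line} = $offset;        # key copy + value + per-entry bookkeeping
    $offset += length($line) + 1;   # +1 for the newline the real file would have
}

printf "%d entries indexed\n", scalar keys %index;
```

Each hash entry carries on the order of a hundred bytes of overhead beyond the key and value themselves, which is how a 350MB file of short lines can balloon to tens of gigabytes once hashed.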
In reply to Re^4: Indexing two large text files by BrowserUk
in thread Indexing two large text files by never_more