I guess you don't have enough memory to use Tie::File on a 500GB file, since it slurps the entire file.
If you're only reading from the file (as seems to be the case here), Tie::File doesn't slurp the file. (I don't know how it handles writes.) It does keep a cache of lines in memory, but the size of that cache is configurable.
What it does do is keep in memory an index of every line up to the last one accessed. scalar(@tied) and $#tied count as accessing the last line. So if you do $tied[rand @tied], Tie::File will read through the entire file to build an in-memory index of every line in the file. For a 500GB file, that index could easily take many GBs.
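To make the cost concrete, here's a minimal sketch. The filename and the memory cap are illustrative, not from the thread; the `memory` option (in bytes) bounds only the line *cache*, not the offset index that `scalar @tied` forces Tie::File to build.

```perl
use strict;
use warnings;
use Fcntl 'O_RDONLY';
use Tie::File;

# Build a small demo file (a stand-in for the 500GB file under discussion).
my $file = 'demo_lines.txt';
open my $fh, '>', $file or die "open: $!";
print {$fh} "line $_\n" for 1 .. 100;
close $fh;

# Cap the line cache at ~2MB; open read-only so no writes are possible.
tie my @lines, 'Tie::File', $file, mode => O_RDONLY, memory => 2_000_000
    or die "tie: $!";

# scalar(@lines) forces Tie::File to scan the whole file once, building
# its in-memory offset index for every line -- the cost described above.
my $count  = scalar @lines;
my $random = $lines[ int rand $count ];   # afterwards, an O(1) seek via the index
print "$count lines; picked: $random\n";

untie @lines;
unlink $file;
```

The cache cap keeps line *contents* bounded, but the per-line offset index still grows with the number of lines touched, which is why counting the lines of a huge file is the expensive step.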
In reply to Re^2: Help performing "random" access on very very large file by ikegami
in thread Help performing "random" access on very very large file by downer