in reply to Slurping file into array VS Process file line-by-line
The first example never makes sense in the given form. If you are processing the file sequentially, from beginning to end, use the file iterator rather than slurping everything into an array: you will use far less system memory, and Perl's buffering will handle the reading from disk in a relatively efficient manner.
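For illustration, here is a minimal sketch of both approaches; the filename "input.txt" is just a placeholder:

```perl
use strict;
use warnings;

# Slurping: every line of the file is held in memory at once.
open my $fh, '<', 'input.txt' or die "Cannot open input.txt: $!";
my @lines = <$fh>;
close $fh;

# Iterating: one buffered line at a time, roughly constant memory.
open $fh, '<', 'input.txt' or die "Cannot open input.txt: $!";
while (my $line = <$fh>) {
    chomp $line;
    # process $line here, e.g. print it or match against it
}
close $fh;
```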
As soon as you need to move back and forth between lines, jumping backwards in the file, reading the whole file into memory makes the algorithm much easier to write, at the cost of some system memory. If system memory is scarce and you can afford to trade a little speed for it, take a look at Tie::File, which lets you access a file on disk just like an array in memory.
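A minimal sketch of that random-access style with Tie::File (a core module); again, "input.txt" is a placeholder:

```perl
use strict;
use warnings;
use Tie::File;

# Tie the file to an array; lines are fetched from disk on demand
# rather than loaded up front.
tie my @lines, 'Tie::File', 'input.txt'
    or die "Cannot tie input.txt: $!";

print "last line:  $lines[-1]\n";   # jump to the end of the file
print "first line: $lines[0]\n";    # then back to the beginning
$lines[1] = 'replaced';             # assignments write through to disk

untie @lines;
```

Note that writes through the tied array modify the file itself, so work on a copy if the original must stay intact.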