It is true that foreach over (@array) uses the original array as a list. It is also true that foreach over (1..1_000_000) does not build a list of a million entries. But foreach over a filehandle (for instance foreach (<>)) most assuredly does slurp the file. Try running the following two command-lines interactively to see the difference:
perl -e '$| = 1; print while <>'

vs

perl -e '$| = 1; print foreach <>'

You will find that the first spits back your lines interactively. The second has to wait to slurp up everything you have to say before it starts printing.
Should you ever need to write a filter for dealing with a large amount of data, be very, very careful to use while instead of foreach!
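If it helps, here is a minimal sketch of such a filter; the uppercasing is just a placeholder for whatever per-line transform your real filter does:

#!/usr/bin/perl
use strict;
use warnings;

# A streaming filter: read one line at a time, transform it, print it.
# Memory use stays constant no matter how large the input is.
while ( my $line = <> ) {
    print uc $line;          # placeholder per-line transform
}

# The foreach equivalent would slurp the entire input into a list
# before the loop body ever ran:
#
#   print uc($_) foreach <>;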
On a side note: if you check the benchmark, you will find that the keys version ran faster, 423.73 iterations per second vs. 356.01 for each. If memory is not a constraint, then for hashes of this size, foreach over keys is faster!
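For anyone who wants to run that kind of comparison themselves, a sketch along these lines (using the standard Benchmark module) should do it. The hash size and timing target here are made up for illustration, not the ones benchmarked in this thread:

#!/usr/bin/perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

# Build a test hash; 10_000 keys is an arbitrary size.
my %hash = map { $_ => $_ * 2 } 1 .. 10_000;

cmpthese( -3, {
    # each: walks the hash lazily, one key/value pair at a time.
    each_while => sub {
        my $total = 0;
        while ( my ( $k, $v ) = each %hash ) {
            $total += $v;
        }
        return $total;
    },
    # keys: builds the full list of keys up front, then loops over it.
    keys_foreach => sub {
        my $total = 0;
        foreach my $k ( keys %hash ) {
            $total += $hash{$k};
        }
        return $total;
    },
} );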
In reply to RE (tilly) 3: each or keys?
by tilly
in thread each or keys?
by rdw