10 variables per iteration is nothing. In addition, Perl buffers I/O by default, so there'd be little difference even for large data sets (in which case the array method would consume more memory).
Additionally, the network I/O will probably dominate your runtime by a few orders of magnitude, so even if these micro-optimizations made a significant local difference, the script still wouldn't run appreciably faster.
That said, I prefer to separate my code into phases where I fetch input, munge data or do calculations, and produce output, simply because it helps separation of concerns and therefore makes the code more maintainable.
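For illustration only, a rough sketch of that split (LWP::Simple for the fetch; extract_fields is a made-up placeholder for whatever parsing your real script does):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::Simple qw(get);

    # Phase 1: fetch input
    my @pages = map { get($_) // die "fetch failed: $_\n" } @ARGV;

    # Phase 2: munge data
    my @records = map { extract_fields($_) } @pages;

    # Phase 3: produce output as tab-separated lines
    for my $rec (@records) {
        print join("\t", @$rec), "\n";
    }

    sub extract_fields {
        my ($html) = @_;
        # hypothetical: pull out whatever columns you actually need
        return [ $html =~ m{<td>([^<]*)</td>}g ];
    }

Each phase can then be tested and changed on its own, which matters far more here than shaving cycles off the loop body.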
Makeshifts last the longest.