in reply to: most efficient way to scrape data and put it into a tsv file
10 variables per iteration is nothing. In addition, Perl buffers I/O by default, so there'd be little difference even for large data sets (in which case the array method would consume more memory).
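For what it's worth, here is a minimal sketch of the two approaches being weighed against each other; the records and output file names are invented purely for illustration:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical scraped records; a real script would build these from the pages.
my @rows = ( [ 'foo', 1, 2.5 ], [ 'bar', 3, 4.1 ] );

# Approach 1: print each record as soon as you have it.
# Perl buffers the writes, so this does not mean one syscall per field.
open my $direct, '>', 'direct.tsv' or die "Cannot open direct.tsv: $!";
print {$direct} join( "\t", @$_ ), "\n" for @rows;
close $direct or die "Cannot close direct.tsv: $!";

# Approach 2: collect everything in an array and dump it at the end.
# Same output, but the whole data set sits in memory until the final print.
open my $deferred, '>', 'deferred.tsv' or die "Cannot open deferred.tsv: $!";
my @lines = map { join( "\t", @$_ ) . "\n" } @rows;
print {$deferred} @lines;
close $deferred or die "Cannot close deferred.tsv: $!";
```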
Additionally, the network I/O will probably dominate your runtime by a few orders of magnitude in comparison to anything else, so even if you managed to make any significant local difference with these micro-optimizations, the script still wouldn't run appreciably faster.
That said, I prefer to separate my code into phases where I fetch input, munge data or do calculations, and produce output, simply because it helps separation of concerns and therefore makes the code more maintainable.
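As a rough sketch of what that three-phase shape might look like for a scrape-to-TSV script, assuming LWP::Simple for the fetching; the URLs, the extraction regex, and the output file name are placeholders, not anything from the original question:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::Simple qw(get);

# Phase 1: fetch. The URLs are placeholders.
my @urls  = map { "http://example.com/page${_}.html" } 1 .. 3;
my @pages = grep { defined } map { get($_) } @urls;

# Phase 2: munge. This regex is only a stand-in for the real extraction logic.
my @records;
for my $html (@pages) {
    while ( $html =~ m{<td>([^<]+)</td>\s*<td>([^<]+)</td>}g ) {
        push @records, [ $1, $2 ];
    }
}

# Phase 3: output. One record per line, fields separated by tabs.
open my $tsv, '>', 'scraped.tsv' or die "Cannot open scraped.tsv: $!";
print {$tsv} join( "\t", @$_ ), "\n" for @records;
close $tsv or die "Cannot close scraped.tsv: $!";
```

Each phase can then be tested, replaced, or profiled on its own, which matters far more in practice than how the final print is arranged.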
Makeshifts last the longest.