in reply to Re: arrays : splice or undef ?
in thread arrays : splice or undef ?

Though I haven't tested it, I assumed from the start that line-by-line processing would slow script execution due to the number of lines. The largest file I've run so far is 231 MB, with over 3.7 million lines.
After the file is loaded, I do iterate over each line individually, which means subroutines with more arrays. The additional processing arrays are of course subsets of the file array, and they're another area where I'm trying to avoid excessive disk I/O ( paging on Windows ).
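For reference, a minimal sketch of the two approaches I'm weighing; the file name and process_line() are stand-ins for illustration, not my real code:

    use strict;
    use warnings;

    my $file = 'big_input.txt';    # stand-in for the real 231 MB input

    # Approach 1: slurp the whole file into an array (fast to iterate,
    # but the entire file sits in memory and can push Windows into paging).
    open my $fh, '<', $file or die "Can't open $file: $!";
    my @lines = <$fh>;
    close $fh;
    process_line($_) for @lines;

    # Approach 2: read line by line (constant memory, same per-line work).
    open $fh, '<', $file or die "Can't open $file: $!";
    while ( my $line = <$fh> ) {
        process_line($line);
    }
    close $fh;

    sub process_line { my ($line) = @_; }    # stand-in for the real work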
Dyslexics Untie !!!

Re^3: arrays : splice or undef ?
by davido (Cardinal) on Jun 04, 2013 at 19:37 UTC

    On my system it takes about 63/100ths of a second to read line by line through a file of 3.5 million lines that is 272 megabytes in size (the closest to 231 MB and 3.7 million lines that I happened to have lying around). That's with a no-op loop; whatever you do to process the lines of the file will consume time too, but it will consume virtually the same time whether you're iterating over lines from a file or over the elements of an array.
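    A rough sketch of that kind of no-op timing pass (the file name and the use of Time::HiRes are my assumptions about how one might measure it, not necessarily how it was measured here):

        use strict;
        use warnings;
        use Time::HiRes qw(gettimeofday tv_interval);

        my $file = 'big_input.txt';    # hypothetical large test file

        my $t0 = [gettimeofday];
        open my $fh, '<', $file or die "Can't open $file: $!";
        while ( my $line = <$fh> ) {
            # no-op: just pull each line off the handle
        }
        close $fh;

        printf "Read took %.2f seconds\n", tv_interval($t0);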

    If performance is an issue, profile.
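    For example, Devel::NYTProf from CPAN is a common way to do that (the script name below is a placeholder):

        perl -d:NYTProf your_script.pl   # run the script under the profiler
        nytprofhtml                      # generate an HTML report in ./nytprof/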


    Dave