I am a mere novice, but I am having great fun learning this language. I am currently processing some large text files.
Pseudo-code reads:

    while count < nRecords {
        read in file
        add file to list
        increment count
    }
    sort list
    dump list to new file

I am noticing a slowdown in adding records when the size of the main list reaches 1.5 million records. The list processing then speeds up when 2.3 million are reached, and seems quite happy until 7 million, when my server runs out of memory.
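The loop above can be sketched in Perl roughly as follows. This is a minimal, self-contained sketch: the file names `records.txt` and `sorted.txt` are hypothetical, and the sample input is created by the script itself so it runs as written; in the real program the input file already exists and holds millions of records.

```perl
use strict;
use warnings;

# Hypothetical file names, stand-ins for the real data files.
my ( $in_file, $out_file ) = ( 'records.txt', 'sorted.txt' );

# Create a tiny sample input so this sketch is runnable as-is.
open my $mk, '>', $in_file or die "Cannot create sample input: $!";
print {$mk} "pear\napple\norange\n";
close $mk;

my @records;
open my $in, '<', $in_file or die "Cannot open input: $!";
while ( my $line = <$in> ) {    # while count < nRecords: read in record
    chomp $line;
    push @records, $line;       # add record to list (count is implicit)
}
close $in;

@records = sort @records;       # sort list

open my $out, '>', $out_file or die "Cannot open output: $!";
print {$out} "$_\n" for @records;    # dump list to new file
close $out;
```

Note that `@records` holds every record in memory at once, which is where the growth to millions of entries matters.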
Is the performance a Perl thing, or is Windows trying to scavenge memory prior to using its swap file?
In reply to Large Array Performance by Anonymous Monk