Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:
I am a mere Novice, but having great fun learning this language. I am currently processing some large text files....
Pseudo code reads:

    While count < nRecords {
        Read in File
        Add File to List
        increment count
    }
    Sort List
    Dump List to New File

I am noticing a slowdown in adding records when the size of the main list reaches 1.5 million records; the list processing then speeds up when 2.3 million are reached, and seems quite happy until 7 million, when my server runs out of memory.
Is the performance a Perl thing, or Windows trying to scavenge memory prior to using its swap files?
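For reference, the loop above in Perl would look roughly like the sketch below. The file names, the one-line-per-record format, and the plain lexical sort are assumptions rather than the actual code:

    use strict;
    use warnings;

    # Minimal sketch of the pseudo code above; file names and record
    # layout are assumed, not taken from the original program.
    my $in_file  = 'records.txt';
    my $out_file = 'records_sorted.txt';

    open my $in, '<', $in_file or die "Cannot open $in_file: $!";

    my @records;
    while (my $line = <$in>) {
        chomp $line;
        push @records, $line;    # every record stays in memory
    }
    close $in;

    # A plain lexical sort of the whole array; peak memory is roughly
    # doubled here, because the sorted copy exists alongside @records.
    my @sorted = sort @records;

    open my $out, '>', $out_file or die "Cannot write $out_file: $!";
    print {$out} "$_\n" for @sorted;
    close $out;

Assigning the result back to the same array (@records = sort @records;) should let reasonably recent perls sort in place, which avoids holding two full copies of the list at the peak.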
Replies are listed 'Best First'.
Re: Large Array Performance
by shmem (Chancellor) on Jun 16, 2007 at 10:08 UTC
Re: Large Array Performance
by GrandFather (Saint) on Jun 16, 2007 at 10:16 UTC
by salva (Canon) on Jun 17, 2007 at 15:25 UTC
Re: Large Array Performance
by swampyankee (Parson) on Jun 16, 2007 at 17:55 UTC