in reply to System out of memory while storing huge amount of data
If I understand your post right, there is some huge hash, and then a result set that is generated from it and pushed onto an array. If you wrote the result set to disk instead of pushing it onto an array, things might still work out okay for you: undef the hash, which lets Perl reuse that memory, and then do whatever you have to do with the result set by reading it back from the file.
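Here is a minimal sketch of that approach. The names are hypothetical stand-ins, since your actual code isn't shown: %big_hash plays the role of the huge hash, and results.txt holds the result set on disk.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical stand-in for the original huge hash.
my %big_hash = map { $_ => $_ * 2 } 1 .. 10;

open my $out, '>', 'results.txt'
    or die "Cannot open results.txt: $!";

# Instead of pushing each result onto an in-memory array,
# write it to disk as it is produced.
for my $key ( sort keys %big_hash ) {
    print {$out} "$key\t$big_hash{$key}\n";
}
close $out or die "Cannot close results.txt: $!";

# Free the hash so Perl can reuse that memory for later work.
undef %big_hash;

# Later, process the result set line by line from the file,
# holding only one record in memory at a time.
open my $in, '<', 'results.txt'
    or die "Cannot open results.txt: $!";
while ( my $line = <$in> ) {
    chomp $line;
    my ( $key, $value ) = split /\t/, $line;
    # ... do what you have to do with each record ...
}
close $in;
```

The key point is that neither the hash nor the result set array needs to be resident in memory at the same time; the file acts as the array, and the read loop never holds more than one record.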
Update: after running BrowserUk's benchmark with 1 million simple records, which didn't come anywhere close to taxing the limits of my 32-bit Windows XP box, I'm wondering whether there might be some obscure reason why your system reports "out of memory". I vaguely remember having to increase the size of my paging file in the past; some apps like Firefox can be pretty memory intensive, and I suppose the extra disk space was needed so that those idle programs could be paged out and my Perl program could use that memory. Anyway, just another thought.