in reply to Re: cleaning up memory after closing files on MACOSX
in thread cleaning up memory after closing files on MACOSX
When I close the FH, some memory gets deallocated and the RSS size in top goes down, but not completely, so it keeps growing and growing. On Linux this happens slowly enough that my program usually finishes before I run out of memory, but on OS X the memory grows by leaps and bounds, and I run out of memory after about 10-15 passes through the program.

```perl
my $linectr = 0;
my $ctr2    = 0;
for my $line (@array) {
    $hash{$linectr} = $line;
    if ($ctr2 == 100000) {    # '==', not '='; the original assignment was always true
        addtofile(\%hash);
        %hash = ();
        $ctr2 = 0;
    }
    $ctr2++;
    $linectr++;
}

sub addtofile {
    my $hashref = shift;
    # Append mode; a bare two-argument open() without a mode opens read-only,
    # so the print below would silently fail.
    open(FH, '>>', $tempfile) or die "Cannot open $tempfile: $!";
    foreach my $value (keys %$hashref) {
        print FH "$$hashref{$value}:$value\n";
    }
    close FH;
}
```
However, I found a solution to my problem using Berkeley DB. Since I am trying to sort files that are too large to fit in memory, I thought I would have to break them up and use a merge sort, but if I tie an empty file to a Berkeley DB object I can treat the file as an array and insert lines into the middle of it, so there is no need to keep opening and closing files.
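A minimal sketch of that tie, using the `DB_File` module's `$DB_RECNO` flavor, which presents a plain newline-delimited file as a tied array (the filename `sorted.txt` and the sample lines are illustrative, not from the original post):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DB_File;

# DB_RECNO treats the underlying file as an array of lines.
my $file = "sorted.txt";
tie my @lines, 'DB_File', $file, O_RDWR | O_CREAT, 0644, $DB_RECNO
    or die "Cannot tie $file: $!";

# Ordinary array operations read and write the file directly,
# so a line can be inserted into the middle without rewriting it by hand.
@lines = ();                       # start from an empty file
push @lines, "apple", "cherry";
splice @lines, 1, 0, "banana";     # insert between the two existing lines

untie @lines;
```

After `untie`, the file on disk contains the three lines in sorted order, with no explicit `open`/`close` per insertion.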
janitored by ybiC: balanced <code> tags as per Monastery convention, and a bit o' formatting
Re: Re: Re: cleaning up memory after closing files on MACOSX
by sgifford (Prior) on Aug 29, 2003 at 02:37 UTC