infidel2112 has asked for the wisdom of the Perl Monks concerning the following question:
I have a script that has to parse and generate stats from some HUGE logfiles. It's written so that I keep the bare minimum of logfile content in memory for the least amount of time.
The problem is that garbage collection doesn't seem to happen after the array holding the log data goes out of scope.
Specifically, I watch the memory usage grow during the read by looking via top, but the memory usage never decreases once the array is out of scope, or even if I undef it in the debugger. It's about 20 MB minimum of memory usage after the read, so a drop should be noticeable.
I'm SURE there are no references to it (I don't make any; I just count different items inside the lines), and I even tried making it an object to see if garbage collection would do its thing, but that didn't help.
I'm fetching the files via ssh, and I have also tried undefing the $ssh object after one file read, to no effect (memory usage still stays high).
I'm doing something like:

```perl
foreach my $log ( @log_names ) {
    my ($stdout, $stderr, $exit) = $ssh->cmd("cat $log");
    my @logdata = split /\n/, $stdout;
    undef $stdout;
    # note: a blank line ("") is false, so it would end this loop early
    while ( my $line = pop @logdata ) {
        # parse/collect stats on $line
    }
}
```
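Two things are worth noting about the snippet above: Perl typically returns freed memory to its own allocator for reuse rather than to the OS (so top rarely shows the process shrinking), and `split /\n/, $stdout` builds a second full copy of the log alongside `$stdout`, doubling the peak. A minimal sketch of one way to avoid that second copy, walking the string with a `\G`-anchored global match instead of building `@logdata` (the `$stdout` contents and the per-line stat here are hypothetical stand-ins):

```perl
use strict;
use warnings;

# Hypothetical stand-in for the $ssh->cmd output; in the real script
# this would be the full contents of the remote logfile.
my $stdout = "GET /a\nPOST /b\nGET /c\n";

my %stats;
# Consume $stdout one line at a time with a /g-anchored match;
# unlike split /\n/, this never materializes a full array of lines.
while ( $stdout =~ /\G([^\n]*)\n/gc ) {
    my $line = $1;
    my ($method) = split ' ', $line;   # example stat: count request methods
    $stats{$method}++;
}

print "$_: $stats{$_}\n" for sort keys %stats;   # GET: 2, POST: 1
```

Note that this variant requires each line to end in `\n`; a final unterminated line would be skipped, so the real script may want a cleanup match after the loop.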
thanks for any suggestions or help!
Replies are listed 'Best First'.
•Re: garbage collection not happening?
by merlyn (Sage) on Nov 04, 2004 at 12:57 UTC
Re: garbage collection not happening?
by reneeb (Chaplain) on Nov 04, 2004 at 12:59 UTC
Re: garbage collection not happening?
by dragonchild (Archbishop) on Nov 04, 2004 at 13:46 UTC
Re: garbage collection not happening?
by TedPride (Priest) on Nov 04, 2004 at 13:39 UTC