wink has asked for the wisdom of the Perl Monks concerning the following question:
My Perl is a bit rusty, so I'm posting this to make sure I'm doing things the "right way".
I've got a one-to-many list of values that I read in using the following code (the %dtgs hash is declared globally):
    open(my $dtg_file, "<", $infile) or die "Unable to open $infile: $!\n";
    while (<$dtg_file>) {
        chomp;
        my ($dtg, @files) = split /:/;
        $dtgs{$dtg} = \@files;
    }
    close $dtg_file;
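To make the resulting structure concrete, here is a self-contained sketch of the same parse using a couple of hypothetical input lines in place of $infile (the filenames are made up for illustration):

```perl
use strict;
use warnings;

my %dtgs;

# Hypothetical lines standing in for the contents of $infile:
# each record is dtg:file1:file2:...
my @lines = ("20130923:alpha.dat:beta.dat", "20130924:gamma.dat");

for (@lines) {
    chomp;
    my ($dtg, @files) = split /:/;
    $dtgs{$dtg} = \@files;       # one dtg maps to an array ref of files
}

print scalar @{ $dtgs{20130923} }, "\n";   # prints "2"
```

Each hash value is a reference to a per-record array, so a key with N trailing fields holds an N-element array.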
I do some processing, and when a match is found for one of the files, I want to remove it to speed up further processing (there are about 60k files being compared against over 100 million files, looking for matches).
    sub remove_from_dtgs {
        my ($dtg, $file) = @_;
        my @files = grep { $_ ne $file } @{ $dtgs{$dtg} };
        if (@files == 0) {
            delete $dtgs{$dtg};
        }
        else {
            $dtgs{$dtg} = \@files;
        }
    }
I want to make sure that I'm not creating a memory leak by replacing $dtgs{$dtg} with the new array. If memory serves (no pun intended), Perl will detect that there are no longer any references to the old array and will free the memory. But this script is going to run for a long time (see the 100 million files above), and I want to avoid any issues.
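That recollection can be checked directly: Perl's reference counting reclaims the old array as soon as nothing points at it, which a weak reference makes visible. A minimal sketch (not from the original post; the key name is arbitrary):

```perl
use strict;
use warnings;
use Scalar::Util qw(weaken);

my %dtgs = ( k => [ 'a', 'b' ] );

my $watch = $dtgs{k};   # second reference to the same array
weaken($watch);         # ...made weak, so it does not keep the array alive

$dtgs{k} = [ 'a' ];     # replace; old array's refcount drops to zero

print defined $watch ? "still alive\n" : "freed\n";   # prints "freed"
```

The weak reference becomes undef the moment the old array is freed, confirming there is no leak from the replacement itself (barring circular references, which this structure doesn't have).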
Other optimization suggestions are also welcome. Thanks in advance!
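One possible speedup, offered here as an alternative rather than anything from the post: store each inner list as a hash keyed by filename, so a removal is a single O(1) delete instead of a grep over the whole list and a rebuilt array. A sketch with hypothetical data:

```perl
use strict;
use warnings;

my %dtgs;

# Build: each dtg maps to a hash whose keys are its filenames.
while (my $line = <DATA>) {
    chomp $line;
    my ($dtg, @files) = split /:/, $line;
    $dtgs{$dtg} = { map { $_ => 1 } @files };
}

sub remove_from_dtgs {
    my ($dtg, $file) = @_;
    delete $dtgs{$dtg}{$file};                  # O(1), no grep needed
    delete $dtgs{$dtg} if !%{ $dtgs{$dtg} };    # drop the dtg once empty
}

remove_from_dtgs('20130923', 'alpha.dat');   # one file left under this dtg
remove_from_dtgs('20130924', 'gamma.dat');   # last file, dtg removed

print exists $dtgs{20130924} ? "kept\n" : "dropped\n";   # prints "dropped"

__DATA__
20130923:alpha.dat:beta.dat
20130924:gamma.dat
```

The trade-off is losing the original file order and paying some extra memory per entry, but with 60k files checked against 100 million candidates, hash lookups should dominate.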
Edited with corrections from kennethk
Replies are listed 'Best First'.
Re: Avoiding Memory Leaks
by kennethk (Abbot) on Sep 23, 2013 at 20:37 UTC
by wink (Scribe) on Sep 24, 2013 at 15:18 UTC |