I am gathering data into a hash and have run into the Perl memory "free-up" issue (the memory never returns to the system unless I reboot). I thought I would use the Storable module and break the data collection into steps, storing the data collected at each step to local disk to minimize the issue, but that doesn't seem to help.
use File::Find;
use Storable qw(store retrieve);

my $mbox = {};

foreach my $usr (@usr_dirs) {
    chomp $usr;
    my $usr_dir = $base_dir . "/" . $usr;

    # start each pass from whatever has already been saved to disk
    undef $mbox;
    $mbox = retrieve($storable_file) if -e $storable_file;

    # wanted() does all the data collection into $mbox
    find( \&wanted, $usr_dir );

    store( $mbox, $storable_file ) or die "Can't store!\n";
}
The find subroutine does all the data collection. Before coming up with the above code, I collected all the data into the $mbox hash in one pass and then saved it to local disk, and the memory usage was really high. I was hoping that by breaking it down into retrieve / collect / store steps, the memory usage issue could be resolved.
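To give an idea of what the data collection looks like, a simplified wanted() might be something like the following (just an illustrative sketch, not the real sub; $current_usr, the count/bytes fields and the /var/mail path are placeholders):

use strict;
use warnings;
use File::Find;

my $mbox = {};
my $current_usr;    # set before each find() call (placeholder helper variable)

# simplified stand-in for the real wanted(): tally message count
# and total bytes for the user currently being scanned
sub wanted {
    return unless -f;                        # plain files only; $_ is the current file
    $mbox->{$current_usr}{count}++;
    $mbox->{$current_usr}{bytes} += -s _;    # reuse the stat from -f
}

# example call for one user directory (path is made up)
$current_usr = 'alice';
find( \&wanted, '/var/mail/alice' ) if -d '/var/mail/alice';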
Also, if I am not mistaken, once $mbox is undef'd, the memory it had allocated goes back to the memory pool. So shouldn't the overall maximum memory usage be max(batch_1_mem_usage, batch_2_mem_usage, ..., batch_n_mem_usage)?
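One way I could sanity-check that expectation is to print the process's resident set size after each batch, something like the sketch below (Unix-like systems only, relies on ps being available; the batch data here is simulated, not my real collection):

use strict;
use warnings;

# print the current resident set size (in KB) via ps; Unix-like only
sub report_rss {
    my ($label) = @_;
    my $rss = `ps -o rss= -p $$`;
    chomp $rss;
    print "$label: RSS = $rss KB\n";
}

my $mbox;
for my $batch (1 .. 3) {
    # simulate one batch of collected data
    $mbox = { map { $_ => 'x' x 1024 } 1 .. 50_000 };
    report_rss("after building batch $batch");

    undef $mbox;    # release the Perl-level structure
    report_rss("after undef, batch $batch");
}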