Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I have a simplified example of a problem I'm trying to solve (below). Up until recently, I have had much success with the IPC::Shareable module for sharing variables across parallel forks.

Unfortunately, when I use a hash of hashes with IPC::Shareable, I get orphaned shared memory segments. If I remove the line that says $hash{'test'}{'test'} = 'test', the script runs without orphaning memory. The maintainer of the module is no longer reachable at the listed email address, and if I could get a hash of hashes to share across parallel forks, it would reduce my run-times dramatically...

Any thoughts on another way, short of open2, UDP datagrams, or any other such ugliness?

See the man pages for 'ipcs' and 'ipcrm' for more info.

Broken code below:
---

#!/usr/bin/perl -w
use lib 'site_perl';
use IPC::Shareable (':all');
use Parallel::ForkManager;

my $forkman = new Parallel::ForkManager(1);
my $key;
my %hash;
my %options = (
    create    => 'yes',
    mode      => 0644,
    exclusive => 0,
    destroy   => 'yes',
);
my $index;

my $HANDLE = tie %hash, 'IPC::Shareable', 'data', { %options };

for ($index = 1; $index <= 7; $index++) {
    print "Spawning process: $index\n";
    $forkman->start and next;
    do_this($index);
    $forkman->finish;
}
$forkman->wait_all_children;
print "All children are done\n";

foreach $key (sort keys %hash) {
    print "PID of process $key = $hash{$key}\n";
}

sub do_this {
    my $index = $_[0];
    $HANDLE->shlock();
    $hash{$index} = "PID: $$";
    $hash{'test'}{'test'} = 'test';    # removing this line stops the orphaned segments
    $HANDLE->shunlock();
    return 0;
}

Replies are listed 'Best First'.
Re: Hash of Hashes Shared Across Parallel Forks Orphans Memory
by perrin (Chancellor) on Nov 05, 2003 at 22:26 UTC
    Change that to
    $temp = $hash{'test'};
    $temp->{'test'} = 'test';
    $hash{'test'} = $temp;
    Nothing special here, just the usual issues with tie.
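
    To see why the fetch-modify-store dance matters, here is a minimal sketch using a hypothetical tie class (CopyOnStore, invented for illustration) that copies values in STORE, roughly the way IPC::Shareable serializes them into shared memory. A direct nested write triggers STORE only at autovivification time, with a still-empty inner hash, so the inner assignment never reaches the tied store; re-assigning the whole ref does:

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical stand-in for IPC::Shareable: STORE deep-copies the
    # value, like serializing into a shm segment. (The real module uses
    # Storable and SysV shared memory; this just mimics the effect.)
    package CopyOnStore;
    sub TIEHASH { bless { data => {} }, shift }
    sub FETCH   { my ($s, $k) = @_; $s->{data}{$k} }
    sub STORE {
        my ($s, $k, $v) = @_;
        # copy at store time -- later edits to the caller's ref are lost
        $s->{data}{$k} = ref $v eq 'HASH' ? { %$v } : $v;
    }
    sub FIRSTKEY { my $s = shift; keys %{ $s->{data} }; each %{ $s->{data} } }
    sub NEXTKEY  { each %{ $_[0]->{data} } }

    package main;
    tie my %h, 'CopyOnStore';

    # Direct nested write: STORE fires once, during autovivification,
    # receiving an empty hashref; the {inner} assignment then lands on
    # the caller's copy, not the tied store.
    $h{outer}{inner} = 'lost';
    print 'direct:   ',
        (defined $h{outer}{inner} ? $h{outer}{inner} : '(missing)'), "\n";

    # Fetch-modify-store: the final assignment re-STOREs the whole ref,
    # so the tied store sees the completed structure.
    my $tmp = $h{outer} || {};
    $tmp->{inner} = 'kept';
    $h{outer} = $tmp;
    print "re-store: $h{outer}{inner}\n";
    ```

    With IPC::Shareable the effect is the same, except that FETCH also hands back a fresh deserialized copy each time, so the explicit re-assignment is the only way the change makes it back into the shared segment.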