in reply to Share hash across processes

You really can't access Perl variables in shared memory directly. Systems like IPC::Shareable and memcached always require the data to pass through a serializer such as Storable, creating a separate copy of anything you work with in your process. There is no way around this -- at some point you have to turn the stream of bytes coming from whatever storage you use into a Perl variable, which means building a new one in your local process. There are faster storage mechanisms, such as BerkeleyDB, but they all still have to create a local variable when you want to do something with the data.
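As a rough sketch of why the copy is unavoidable (the %config hash here is made-up sample data), any such layer ends up doing the equivalent of a Storable freeze/thaw round trip, and the thawed result is a brand-new structure:

    use strict;
    use warnings;
    use Storable qw(freeze thaw);

    # Sample data standing in for whatever you'd put in shared storage.
    my %config = ( host => 'db1', port => 5432 );

    # Serialize to a byte stream, as IPC::Shareable or a memcached
    # client does when storing the value...
    my $bytes = freeze(\%config);

    # ...and deserialize on fetch, which builds a brand-new Perl
    # structure in the fetching process's own memory.
    my $copy = thaw($bytes);

    # The two reference addresses differ: the fetched value is a
    # separate local copy, not a window onto shared memory.
    printf "original: %s\ncopy:     %s\n", \%config, $copy;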

It is possible to load the data up front and then fork the worker processes. The data will be shared, with only one physical copy, thanks to copy-on-write. That sharing only holds as long as the data stays effectively read-only, though: any process that writes to it gets its own private copy of the affected pages (and Perl's own internal bookkeeping can dirty some pages even without an explicit write).
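A minimal sketch of that pattern (the hash contents and the count of ten workers are placeholders):

    use strict;
    use warnings;

    # Build the big lookup table once, in the parent, before forking.
    my %lookup = map { $_ => $_ * 2 } 1 .. 1_000_000;

    my @pids;
    for my $child (1 .. 10) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {
            # Child: reads the parent's pages via copy-on-write, so no
            # second copy of %lookup is made as long as nothing writes.
            my $sum = 0;
            $sum += $lookup{$_} for 1 .. 100;
            print "child $child: sum = $sum\n";
            exit 0;
        }
        push @pids, $pid;
    }
    waitpid($_, 0) for @pids;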

Even loaded separately in 10 programs, it sounds like you have less than 500MB of data in total, and that doesn't cost much these days. I'd think twice before spending more time on this.

You really can't access Perl variables in shared memory directly
by bmac888 (Novice) on Jun 08, 2007 at 21:17 UTC
    Your statement "You really can't access Perl variables in shared memory directly" is the conclusion I had reached myself after researching this and talking to a number of people over the past week. My intuition told me otherwise, but I thought I would throw it out to this group.

    My larger issue is that I may have tens of these "groups of ten" jobs running at once, so the 500MB quickly becomes 5GB, plus I pay a performance penalty.