in reply to Replacing Data::Dumper / do(file) on multi-fork process

If you don't care how easy it is for other languages to access the data (I don't know how hard it is to deal with Storable frozen data outside of Perl), then what I've done is create a database table (in whatever db you want) with an ID field, possibly some other meta information, and then a blob that holds the result of a Storable::freeze call. You'd then be able to do something like:
sub SaveSession {
    # this could self-generate the session id using
    # a db's autoincrement feature, with UUIDs, or
    # a different homebrewed method
    my $session_id        = shift();
    my $session_struct    = shift();
    my $serialized_struct = Storable::freeze($session_struct);
    # pseudo sql:
    #   insert into session_table (id, session)
    #   values ($session_id, $serialized_struct)
    # handle errors
    return;
}

sub GetSession {
    my $session_id = shift();
    # pseudo sql:
    #   select * from session_table where id = $session_id
    my $row = ...;    # sql result
    # handle errors
    return Storable::thaw( $row->{'session'} );
}
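As a hedged sketch of what that pseudo SQL might look like fleshed out, here is one way to do it with DBI. The table name `session_table`, the column names, and the DBD::SQLite backend are my assumptions for illustration, not something the post prescribes:

```perl
use strict;
use warnings;
use DBI;
use Storable qw(freeze thaw);

# Assumed schema, created here so the sketch is self-contained:
#   id is the session id, session holds the frozen blob
my $dbh = DBI->connect( 'dbi:SQLite:dbname=sessions.db', '', '',
    { RaiseError => 1, AutoCommit => 1 } );
$dbh->do(
    'CREATE TABLE IF NOT EXISTS session_table (id TEXT PRIMARY KEY, session BLOB)'
);

sub SaveSession {
    my ( $session_id, $session_struct ) = @_;
    my $serialized = freeze($session_struct);
    # placeholders avoid quoting problems with the binary blob;
    # INSERT OR REPLACE is SQLite-specific upsert syntax
    my $sth = $dbh->prepare(
        'INSERT OR REPLACE INTO session_table (id, session) VALUES (?, ?)');
    $sth->execute( $session_id, $serialized );
    return;
}

sub GetSession {
    my ($session_id) = @_;
    my $row = $dbh->selectrow_hashref(
        'SELECT * FROM session_table WHERE id = ?', undef, $session_id );
    return unless $row;    # no such session
    return thaw( $row->{session} );
}
```

With RaiseError set, failed statements die on their own, which stands in for the "handle errors" steps above.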
You now have a reliable way to store and retrieve the datastructures you want, so you can merge 'em together however you like. If you didn't want to deal with a database, you could replace the db calls with something that writes a file, named for the session id you care about, into a single directory. (Waving hands about unique filenames; I'd look at UUIDs to reliably name the files.) You could then build a mechanism that will open and thaw a file based on the id you pass it, or open and thaw all the files, etc. Then you can do your merge in memory to display the combined results and not worry about having to continuously update the merged results.
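A minimal sketch of that file-based variant, assuming session ids are safe to use as filenames and that a shallow hash merge (later files win on key collisions) is good enough; the directory name and function names are mine:

```perl
use strict;
use warnings;
use Storable qw(nstore retrieve);
use File::Spec;

# Assumed layout: one Storable file per session id in a single directory
my $session_dir = 'sessions';
mkdir $session_dir unless -d $session_dir;

sub SaveSessionFile {
    my ( $session_id, $session_struct ) = @_;
    # nstore writes in network byte order, so the files stay
    # portable between machines
    nstore( $session_struct,
        File::Spec->catfile( $session_dir, $session_id ) );
    return;
}

sub GetSessionFile {
    my ($session_id) = @_;
    my $file = File::Spec->catfile( $session_dir, $session_id );
    return unless -e $file;    # no such session
    return retrieve($file);
}

# Thaw every session file and merge the hashes in memory.
# This is a shallow merge; swap in something like Hash::Merge
# if you need deep merging of nested structures.
sub MergeAllSessions {
    my %merged;
    opendir my $dh, $session_dir or die "can't open $session_dir: $!";
    for my $file ( grep { !/^\./ } readdir $dh ) {
        my $struct = retrieve( File::Spec->catfile( $session_dir, $file ) );
        %merged = ( %merged, %$struct );
    }
    closedir $dh;
    return \%merged;
}
```

Each forked worker can call SaveSessionFile with its own id, and the parent rebuilds the combined view on demand with MergeAllSessions instead of maintaining a merged copy continuously.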