Dearest Monks, your assistance is requested. For a possible upcoming feature on a website, I'm dealing with several large hashes, and I need to intersect a couple of them efficiently. Basically, I want to take two or more hashes and quickly get back an array of the keys common to all of them. Should be easy, right?
Now, I could do it iteratively, like this:
my @common;
foreach (keys %$foo) {
    push @common, $_ if exists $bar->{$_};
}
Sure sure, that's great. I can grab each pair of hashes and run them through this operation. It can be optimized by putting whichever hash has fewer keys as $foo, sure, but I really feel that I'm not taking advantage of any internal organization the hashes may have. Is there some sort of lower-level operation that will give me an array (or whatever) of the keys common to two (or more) of them, without destroying any of the hashes?
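For what it's worth, the two-hash case can be written as a single grep over the smaller hash. A minimal sketch (the hashrefs and their contents are made up for illustration; exists() is used so keys whose values happen to be 0, '' or undef still count as present):

```perl
use strict;
use warnings;

# Sample hashrefs, made up for illustration.
my $foo = { a => 1, b => 2, c => 0 };
my $bar = { b => 9, c => 8, d => 7 };

# Walk whichever hash has fewer keys and probe the bigger one;
# keys() in scalar context gives the key count.
my ($small, $big) = keys(%$foo) <= keys(%$bar) ? ($foo, $bar) : ($bar, $foo);
my @common = grep { exists $big->{$_} } keys %$small;
```

This is still O(keys of smaller hash) probes, same as the loop, just tidier.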
I've looked pretty thoroughly through
perlfunc and have come up short. Is there a way to speed up this mass comparison? Would destroying the hashes help? Thanks for your time. I hope I'm not missing anything major. Ideally, I'd love:
my @foo = commonkeys($foo, $bar, $splat, $woo, ...);
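As far as I know there's no such builtin, but a commonkeys() along those lines can be sketched by counting how many of the hashrefs each key shows up in (the name commonkeys and the sample data are just the wished-for interface above, not a real function):

```perl
use strict;
use warnings;

# Hypothetical commonkeys(): returns the keys present in every hashref
# passed in. Since keys are unique within a single hash, a key common to
# all of them is counted exactly once per hash, i.e. scalar(@_) times.
sub commonkeys {
    my %count;
    for my $h (@_) {
        $count{$_}++ for keys %$h;
    }
    my $n = @_;    # number of hashrefs, in scalar context
    return grep { $count{$_} == $n } keys %count;
}

my @common = commonkeys({ a => 1, b => 2 }, { b => 3, c => 4 }, { b => 5 });
```

One pass over every hash, no hash destroyed, and it takes any number of hashrefs.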
Anything you guys can think of that isn't in the vein of my current approach? Thanks for your time, gentle monks. Searching for cycles,
--jaybonci