cosmicperl has asked for the wisdom of the Perl Monks concerning the following question:

Hi All,
  I've almost finished my new config module (well, the working parts anyway). It generates the HTML forms, verifies input, and saves the new variable config to files with Data::Dumper so they just need to be required in; special config is saved to anonymous subroutines in a dispatch table. It's been a nightmare, but it's all looking pretty much the way I wanted it to, except now I've hit the final hurdle and it seems I have two options.
  The problem is that the old system I used wrote the variable files with a nasty bit of code that spelled everything out, such as:-
$::Config->{group1}->{opt1} = 'A';
$::Config->{group1}->{opt2} = 'B';
$::Config->{group1}->{opt3} = 'C';
$::Config->{group2}->{opt1} = 'D';
So if I included another variable file later, it would simply overwrite any doubled-up variables, which is what I want:-
$::Config->{group1}->{opt3} = 'C2';
$::Config->{group2}->{opt2} = 'E';
The problem is that now I'm using Data::Dumper, the variable files look like:-
$::Config = {
    group1 => {
        opt1 => 'A',
        opt2 => 'B',
        opt3 => 'C',
    },
    group2 => {
        opt1 => 'D',
    },
};
And:-
$::Config = {
    group1 => {
        opt3 => 'C2',
    },
    group2 => {
        opt2 => 'E',
    },
};
So if I load the second file after the first, it loses data: rather than updating the hash, it replaces it wholesale.

So as far as I can see my options are:-
A) Get Data::Dumper to dump in the old format. The problem is that after looking at the examples I can't see a way of doing this, so it could mean dropping Data::Dumper and writing my own dumper...
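For what it's worth, a hand-rolled dumper along those lines needn't be huge. This is only a sketch (the `dump_assignments` name and the recursion are mine, not anything Data::Dumper provides): it walks the nested hash and emits one fully-qualified assignment per leaf, in the old overwrite-friendly format.

```perl
use strict;
use warnings;

# Recursively walk a nested hash and return one assignment line per
# leaf value, e.g. "$::Config->{group1}->{opt1} = 'A';".
sub dump_assignments {
    my ($ref, @path) = @_;
    my @lines;
    for my $key (sort keys %$ref) {
        my $val = $ref->{$key};
        if (ref $val eq 'HASH') {
            push @lines, dump_assignments($val, @path, $key);
        } else {
            my $subscripts = join '', map { "->{$_}" } @path, $key;
            push @lines, "\$::Config$subscripts = '$val';";
        }
    }
    return @lines;
}

my $config = {
    group1 => { opt1 => 'A', opt2 => 'B', opt3 => 'C' },
    group2 => { opt1 => 'D' },
};
print "$_\n" for dump_assignments($config);
```

A real version would also need to escape quotes in values and handle array refs, but the shape of the thing is just a depth-first walk.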

B) Rename the first config hash, then merge it with the new one using Hash::Merge. The problem with this is that it's going to be less efficient and slower than A).
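A sketch of the merge route, assuming Hash::Merge's functional interface: the default behaviour is LEFT_PRECEDENT (the left hash wins on conflicts), so for later config files to override earlier ones you'd switch to RIGHT_PRECEDENT first.

```perl
use strict;
use warnings;
use Hash::Merge qw(merge);

# Later files should win on conflicting keys, so use RIGHT_PRECEDENT
# instead of the default LEFT_PRECEDENT.
Hash::Merge::set_behavior('RIGHT_PRECEDENT');

my $base = {
    group1 => { opt1 => 'A', opt2 => 'B', opt3 => 'C' },
    group2 => { opt1 => 'D' },
};
my $override = {
    group1 => { opt3 => 'C2' },
    group2 => { opt2 => 'E' },
};

my $merged = merge($base, $override);
# group1/opt3 comes from $override; everything else survives from $base
```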
Is there a simpler solution that I'm missing?

Thanks in advance

Lyle

UPDATE: I've been trying the merge route as I thought it'd be quicker. The problem is that Hash::Merge doesn't merge array references the way I need, just hashes, so I've hacked it to merge arrays better. I'm testing it now, but I'm not sure whether this method is going to be more trouble than it's worth and I'll end up writing a new dumper :(
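If hacking Hash::Merge proves fragile, a small hand-rolled deep merge is another option. This is only a sketch (`deep_merge` is my own name, not a CPAN function): hashes merge key by key, array refs merge element-wise with the newer side winning, and conflicting scalars are taken from the newer side.

```perl
use strict;
use warnings;

# Deep-merge $right into $left, returning a new structure.
# Hashes: merged recursively. Arrays: element-wise, right wins.
# Scalars and mismatched types: right wins outright.
sub deep_merge {
    my ($left, $right) = @_;
    return $right unless ref $left eq ref $right;
    if (ref $left eq 'HASH') {
        my %merged = %$left;
        for my $key (keys %$right) {
            $merged{$key} = exists $left->{$key}
                ? deep_merge($left->{$key}, $right->{$key})
                : $right->{$key};
        }
        return \%merged;
    }
    if (ref $left eq 'ARRAY') {
        my @merged = @$left;
        $merged[$_] = $right->[$_] for 0 .. $#$right;
        return \@merged;
    }
    return $right;    # plain scalars: the newer value wins
}

my $old = { group1 => { opt3 => 'C',  list => [ 'x', 'y', 'z' ] } };
my $new = { group1 => { opt3 => 'C2', list => [ 'X' ] } };
my $cfg = deep_merge($old, $new);
# $cfg->{group1}{opt3} is 'C2'; list is [ 'X', 'y', 'z' ]
```

Whether element-wise overwriting is the right array semantics depends on what the arrays mean in the config, of course; appending or replacing whole arrays are equally easy variations.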

Replies are listed 'Best First'.
Re: Last hurdle.. Merge hash refs or change data dump?
by moritz (Cardinal) on Apr 16, 2008 at 13:04 UTC
    Did you even measure how much time you lose with Hash::Merge?

    I don't expect it to be a big performance hit, or to even be measurable compared to normal startup time and noise.

      Problem is in my scenario the config files are read in on some click tracking that needs to be able to handle hundreds or thousands (or even more) of clicks per second. So even the slightest performance drop will scale up to having a big effect.
        But these micro-optimizations are the wrong solution.

        Better use something like mod_perl or fastcgi where you have a persistent program in memory, so you don't have to deal with module loading and other startup costs for every hit.
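        As a sketch of that idea (the package and sub names below are made up for illustration): in a persistent process you cache the merged config in a lexical that lives as long as the process, so the expensive load-and-merge runs once at startup, not once per click.

```perl
use strict;
use warnings;

package ClickTracker::Config;    # hypothetical package name

my ($config, $loads);

# Return the cached config, loading it only on the first call
# in this process; subsequent calls are a hash lookup away.
sub config {
    $config ||= _load();
    return $config;
}

sub _load {
    $loads++;
    # stand-in for requiring and merging the real variable files
    return { group1 => { opt1 => 'A' } };
}

sub load_count { $loads }

package main;

my $first  = ClickTracker::Config::config();
my $second = ClickTracker::Config::config();
# the expensive load ran once; both calls share the same hashref
```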