in reply to AUTOLOAD for variables?

When you access a variable that does not exist, Perl will autovivify it (i.e. create it for you). Although TheDamian was doing some work on an AUTOVIVIFY sub (equivalent to AUTOLOAD, but for variables), AFAIK you can't currently catch access to undefined variables.
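That said, if the values live in a hash rather than in package variables, you can get the same lazy-loading effect today with a tied hash: each key is fetched from the expensive source only on first access. This is a hedged sketch, not a drop-in solution -- the `LazyConfig` package name and the loader sub are assumptions; substitute your real parser for the loader.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Minimal tie class: values are loaded on demand, then cached.
package LazyConfig;

sub TIEHASH {
    my ($class, $loader) = @_;
    return bless { data => {}, loader => $loader }, $class;
}

sub FETCH {
    my ($self, $key) = @_;
    # First access triggers the (expensive) loader; later reads hit the cache.
    $self->{data}{$key} = $self->{loader}->($key)
        unless exists $self->{data}{$key};
    return $self->{data}{$key};
}

sub STORE    { $_[0]{data}{$_[1]} = $_[2] }
sub EXISTS   { exists $_[0]{data}{$_[1]} }
sub DELETE   { delete $_[0]{data}{$_[1]} }
sub FIRSTKEY { my $n = keys %{$_[0]{data}}; each %{$_[0]{data}} }
sub NEXTKEY  { each %{$_[0]{data}} }

package main;

my %config;
tie %config, 'LazyConfig', sub {
    my $key = shift;
    return "value-for-$key";   # stand-in for the real database lookup
};

print $config{foo}, "\n";      # only 'foo' is ever loaded
```

The win is that a config file that reads three values costs three lookups, no matter how many thousands of values the database could supply.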

It sounds like you have some significant design issues. Perl code for config files plus lots of users is an *interesting* (scary) concept. What is to stop someone inserting arbitrary code that will then run with the permissions of your master script?
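If you must execute user-supplied config code, the core Safe module at least limits the damage: it evals the code in a restricted compartment whose default op mask denies `system`, file opens, and friends. A hedged sketch (the `Config::Sandbox` namespace and the `$answer` variable are made up for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Safe;

# Compile and run untrusted config code inside a restricted compartment.
# Variables the config sets land in the compartment's root namespace.
my $compartment = Safe->new('Config::Sandbox');

my $code = '$answer = 42;';   # stand-in for the config file's contents
$compartment->reval($code);
die "config error: $@" if $@;

no warnings 'once';
print $Config::Sandbox::answer, "\n";   # prints 42
```

This is containment, not a full security review -- but it is a lot better than a bare `do $config_file` run as you.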

Your explanation of the problem you are trying to solve may make sense to you, but it is far from clear at this end. You have a performance problem and you say it is loading the config files. Are you sure? What is actually in these files? Why do you think delaying loading will speed things up? Is this a CGI? Are you using mod_perl?

cheers

tachyon

Re^2: AUTOLOAD for variables?
by dpuu (Chaplain) on Jul 08, 2004 at 04:19 UTC
    I was trying not to get into too many details -- they aren't terribly important -- except for finding alternative solutions that don't involve the AUTOVIVIFY method.

    Yes, lots of users with Perl config files is kinda scary -- especially when you see how some people take advantage of having Perl. The context has nothing to do with the web. It's actually the test environment for our hardware (chip) development. People configure the test scripts by grabbing configuration from a database of chip-specific information -- and that database has grown somewhat big over the years (along with the number of scripts that access it).

    You are right that there are some serious design issues -- the sort that take a fair amount of time to sort out. We basically have to trawl through a deeply nested directory structure, over NFS, to find about 50K files in the leaves -- then parse those files into a flat namespace for the config file (it's not a real database, which may be part of the problem; it's not quite as bad as it sounds, because we can reduce the scanning with some caching). Then a config file uses only a small number of the values we supply. Multiply this across a farm with multiple thousand CPUs, and soon you're talking real money!

    It really is tremendously ugly (and it may be impossible to reverse-engineer a spec). I was hoping to find a quick fix to use as a stop-gap measure while sorting out the real problems. Perhaps if there is no quick fix, then there's a greater incentive to apply the resources needed to clean it up. I guess this is what happens when a startup becomes big, quickly.

    --Dave
    Opinions my own; statements of fact may be in error, or may be deliberately obscured.

      flat namespace for the config file (it's not a real database, which may be part of the problem

      It really depends on how it actually works in practice, but it sounds like your end data structure could work better. Depending on all sorts of factors, an RDBMS, or a flat file that stores a hash data structure serialised with, say, Storable, may help. With the Storable suggestion you would parse the data into a hash (very fast access), then store it to a flat file with Storable. Anything that needs access just loads the file and gets the whole config hash, ready to go. The net result is that you effectively spend memory to gain speed.
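      A minimal sketch of the Storable round trip (the file name and the hypothetical chip values are invented for illustration -- the point is the `store`/`retrieve` pair from the core Storable module):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Storable qw(store retrieve);

# One-off (or cached) build step: parse everything into one hash...
my %config = (
    chip_name  => 'widget9000',        # hypothetical values
    clock_mhz  => 450,
    test_ports => [qw(jtag scan uart)],
);
store \%config, 'config.sto';

# ...then every script slurps the whole structure back in a single read:
my $cfg = retrieve 'config.sto';
print "$cfg->{chip_name} runs at $cfg->{clock_mhz} MHz\n";
```

      One binary read over NFS instead of a 50K-file trawl is where the speed comes from; the cost is holding the whole hash in each process.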

      cheers

      tachyon