
The problem is that in my scenario the config files are read in by a click-tracking script that needs to handle hundreds or thousands (or even more) of clicks per second, so even the slightest performance drop scales up to a big effect.
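
Since the thread title asks about merging hash refs, here is a minimal sketch of the shallow-merge idiom; the keys and values below are invented for illustration and are not from the actual config:

    use strict;
    use warnings;

    # Hypothetical defaults and per-site overrides.
    my $defaults = { timeout => 5, log_level => 'warn', retries => 3 };
    my $site     = { log_level => 'debug' };

    # Shallow merge: keys from the later hash win on collision.
    my $config = { %$defaults, %$site };

    print $config->{log_level};    # prints "debug"

Note that this merges only one level deep; nested hashrefs would need something like Hash::Merge or a recursive walk.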

Re^3: Last hurdle.. Merge hash refs or change data dump?
by moritz (Cardinal) on Apr 16, 2008 at 13:35 UTC
    But these micro-optimizations are the wrong solution.

    Better to use something like mod_perl or FastCGI, where a persistent program stays in memory, so you don't pay module-loading and other startup costs on every hit.
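
    A minimal sketch of that approach with the FCGI module, assuming the merged config can be loaded once at startup (the file path and response body are placeholders):

        use strict;
        use warnings;
        use FCGI;
        use Storable qw(retrieve);

        # Paid once at startup, not on every hit (placeholder path).
        my $config = retrieve('/etc/clicktrack/config.stor');

        my $request = FCGI::Request();
        while ($request->Accept() >= 0) {
            # Per-hit work only; compile and config costs are amortized.
            print "Content-Type: text/plain\r\n\r\n";
            print "ok\n";
        }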

      I'll be implementing a FastCGI version, and I've been helping the FastCGI team out with their site. But for the time being I've been working hard on all the micro-optimizations, and it's been paying off.
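
      One way to verify that a micro-optimization actually pays off is Benchmark's cmpthese. This hypothetical run compares re-merging the hashrefs on every hit against reusing a pre-merged copy (the data is invented):

          use strict;
          use warnings;
          use Benchmark qw(cmpthese);

          my $defaults = { timeout => 5, log_level => 'warn', retries => 3 };
          my $site     = { log_level => 'debug' };
          my $merged   = { %$defaults, %$site };

          # Negative count: run each sub for at least 2 CPU seconds.
          cmpthese(-2, {
              merge_per_hit => sub { my %c = (%$defaults, %$site); $c{log_level} },
              reuse_merged  => sub { $merged->{log_level} },
          });
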
        But for the time being I've been working hard on all the micro-optimizations, and it's been paying off.

        In the short term it often does. In the long term, you lose so much in opportunity costs that you might as well not even bother.