in reply to Optimizing into the Weird Zone

For a long time, I've been of the opinion that after a certain point, optimisation becomes counter-productive. It happens on two fronts. On the one hand, you are spending exponential time for linear (if you're lucky) gains in performance. On the other hand, you are making your system overly sensitive to perturbations in the initial assumptions. And all for what? So your system will run in 5 seconds instead of 10. For me, I'll take the less optimised, more stable solution every time. It makes my life easier, but more importantly, it makes others' lives easier when they have to maintain anything that I've written.

thor

Replies are listed 'Best First'.
Re: Re: Optimizing into the Weird Zone
by husker (Chaplain) on Aug 12, 2003 at 15:53 UTC
    Companies that spend millions of dollars on transaction systems, systems that sit at the core of revenue-generating processes, DO CARE that it takes 5 seconds instead of 10. And that's why chip people will go to great lengths to optimize chips: optimized chips sell better than chips that aren't.

    As for programming optimizations ... Maybe you spend 40 work hours squeezing out a 3% performance gain. Now, if your system is being used by a lot of people, and for something "important", you're going to leverage that 3% with every user who saves that time. Given a decent user base, you're going to get those 40 hours back pretty quickly.
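
    A rough back-of-the-envelope sketch of that break-even point (the 40 hours and the 3% are from the paragraph above; the run length and user numbers are made-up figures, purely for illustration):

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Hypothetical numbers: the 40 hours and 3% come from the post above;
        # the run time and user counts are invented for illustration.
        my $dev_hours    = 40;          # developer time invested
        my $run_seconds  = 600;         # say, a 10-minute job
        my $gain         = 0.03;        # 3% performance gain
        my $runs_per_day = 500 * 2;     # say, 500 users running it twice a day

        my $saved_per_day = $run_seconds * $gain * $runs_per_day;    # seconds saved per day
        my $break_even    = ($dev_hours * 3600) / $saved_per_day;    # days to recoup the effort
        printf "Break-even after %.1f days\n", $break_even;          # prints 8.0 days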

    People who are trying to make more money by using your software are very concerned that your program does its task correctly in the shortest time possible.

      You spend countless hours and untold amounts of money on optimisations, only to find out later that the users go out to have a cup of coffee during that long job and their computers happily spend gazillions of fully optimised CPU-cycles waiting for the users to return and hit the 'ENTER' key.

      CountZero

      "If you have four groups working on a compiler, you'll get a 4-pass compiler." - Conway's Law

      As I implied before, there's optimizing, and then there's optimizing. I'm all for making code run fast. However, there comes a point where you have to weigh the cost against the benefit. I mean, there's a reason why we don't write everything in assembly code, no? :)

      thor

      I have to agree with husker on one point. Take a bank at the end of the year: 24 hours is not much time if you have long-running batch jobs that take several hours. They have to be finished before the regular jobs start again. A million transactions times a tenth of a second makes a difference. And, after all, CPU time costs money on a mainframe.
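
      To put that tenth of a second into numbers (a quick sketch; the one million transactions and the 24-hour window are just the figures mentioned above):

          #!/usr/bin/perl
          use strict;
          use warnings;

          # One million transactions, each a tenth of a second slower:
          my $extra_seconds = 1_000_000 * 0.1;                  # 100_000 extra seconds
          printf "%.1f extra hours\n", $extra_seconds / 3600;   # 27.8 extra hours
          # ...which on its own is already more than a 24-hour batch window.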

      But as I said elsewhere in Structure is more important than speed, that is just a small part of the development business. Most of the time it is not worth spending a single minute on micro-optimization.

      And even for a bank, it is more important to end up with correct values than with fast ones. They do not like optimized code where nobody can fix that special case that has occurred for the first time.

      And it came to pass that in time the Great God Om spake unto Brutha, the Chosen One: "Psst!"
      (Terry Pratchett, Small Gods)