in reply to Optimisation isn't a dirty word.

One case where people caution against optimization is optimization that causes extreme obfuscation. There are many key numbers you need to track when considering optimization: cost of coder time, cost of new hardware, time being spent on certain parts of a program, time originally coding vs. maintenance, actual time seen by a user, etc.

What people caution against is knee-jerk optimization, where a coder dives into an algorithm and emerges with a few more microseconds and a blob of horribly unmaintainable code.

Rather, code it cleanly and correctly from the start. If it's too slow (quantify and measure that), consider faster hardware first; over the long run it's usually cheaper than coder time. If your hardware is already good, identify the source of the majority of the slowness. Once you have metrics that point at your target, optimize that code, with comments explaining what you are doing and why. Then re-test and verify you've made it fast enough.
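As a sketch of the "quantify and measure" step, Perl's core Benchmark module can compare candidate implementations before you commit to either. The two summing approaches below are just stand-ins for whatever hot spot your measurements point at:

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

my @nums = (1 .. 10_000);

# Compare two ways of summing a list; cmpthese(-1, ...) runs each
# sub for about one CPU second and prints a rate comparison table.
cmpthese(-1, {
    for_loop => sub {
        my $sum = 0;
        $sum += $_ for @nums;
    },
    while_shift => sub {
        my @copy = @nums;
        my $sum  = 0;
        $sum += shift @copy while @copy;
    },
});
```

For finding which part of a real program is slow in the first place, a profiler such as Devel::NYTProf is the usual tool; Benchmark is for comparing alternatives once you already know where the time goes.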

So optimization isn't bad, it's just not the first thing you should jump on when something seems slow.

Re^2: Optimisation isn't a dirty word.
by Perl Mouse (Chaplain) on Oct 25, 2005 at 16:16 UTC
    If it's too slow (quantify and measure that), consider faster hardware first, it's usually cheaper than coder time over the long run.
    That I disagree with on multiple levels. Sure, if you as a developer only write one program, which no one wants, so it's running just on your box, you are right. It's cheaper to upgrade your hardware than to spend time optimizing your program.

    But if your program actually gets run on several thousand different boxes, it becomes a different matter. Upgrading 2,000 boxes at $500 each means spending a million dollars. That's a few programmer-years. And if you produce a program once a month, and each time you invest in faster hardware, you'll run into limits (the bottom of your wallet, or the state of the industry) pretty fast as well.

    Perl --((8:>*
      But if your program actually gets run on several thousand different boxes, it becomes a different matter

      Sure, but 90% of software development is done as some form of in-house customization work; less than 10% is coding software as an industry commodity. Very, very, very few apps will ever make it onto hundreds, let alone thousands, of different boxes. Of those, in some cases the hardware upgrade will indeed be justified. Your point attacks a straw man; the common case is so far removed from your argument as to be entirely inapplicable.

      And if you produce a program once a month, and each time you're investing in faster hardware

      Almost no one produces a program "once a month", nor invests in faster hardware "once a month". It takes at least three months just for the average department to decide what it wants, let alone document those wishes clearly enough for them to be written into a program of noticeable complexity. Again, you're vigorously assaulting a non-issue.

      Upgrading 2,000 boxes for $500 each means spending a million dollars. That's a few programmer years

      Well, at my company, we wasted several programmer-years trying to do an in-house optimization as part of a code cleanup. It didn't work; the designer of the new system was incompetent. The new code is as ugly as the old, and the "optimizations" slowed things down so badly that we bought new hardware to solve the problem. How did we eventually triple the throughput of our program? We ran four copies in parallel on a quad-CPU machine. ;-)
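      (The "four copies in parallel" trick needs no special framework in Perl. A hedged sketch using plain fork, where process_chunk is a hypothetical stand-in for the real per-item work:)

```perl
use strict;
use warnings;

# Split the work into one chunk per CPU and fork a worker for each.
my $workers = 4;
my @work    = (1 .. 20);
my $chunk   = int(@work / $workers) + 1;

my @pids;
while (my @slice = splice @work, 0, $chunk) {
    my $pid = fork;
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {              # child: handle its slice, then exit
        process_chunk(@slice);
        exit 0;
    }
    push @pids, $pid;             # parent: remember the child
}
waitpid $_, 0 for @pids;          # wait for all workers to finish

sub process_chunk {
    my @items = @_;
    # placeholder for the real per-item work
}
```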

      In many places, putting just two contract programmers on a project for three weeks costs the company about $10,000. Buying $10,000 worth of hardware is probably going to give you a faster-running program, sooner, for the same cost to the company. If the program gets scrapped by upper management, the company can still make use of the hardware; it's a generally valuable asset. Not so for the optimized code; it's a strictly limited-use asset. Unlike the optimization effort, the hardware upgrade is very unlikely to introduce bugs into the software. Unless you're buying a lot of hardware (and most companies aren't), it's just no contest.

      So, unless you work for Microsoft, or some other software vendor, you don't need optimized code as badly as you need bug-free code. Fast hardware is cheap. Fast software is expensive.

        we wasted several programmer-years trying to do an in-house optimization as part of a code cleanup.

        Seems to me, part of the problem is right there. If you're doing code clean-up, you're attempting to make your code more maintainable, readable, modularized, etc. If you're doing optimization, you're attempting to make it more efficient. Doing both at the same time is inordinately difficult and makes failure at both more likely. I'd posit that even if you're a wizard who knows every nook and cranny of Perl and can mentally juggle thousands of lines of code, you're probably better off doing clean-up and optimization in several alternating passes. (I'm certainly no such wizard, so this is pure speculation on my part, but it makes sense.)

        This is also where I have the most problems with the OP. While I agree that performance optimization is often important and probably doesn't get enough good press these days, I think it is dangerous to do too much of it too soon. It's hard enough coming up with a program design that is flexible yet simple; strongly leaning towards optimization at the same time makes things far harder, and there aren't many people who can pull it off well. Also, I think it will always be easier to refactor maintainable code for performance than to turn optimized but messy code into something another mortal can understand and maintain.

      That I disagree with on multiple levels.

      Actually, I think your argument agrees with what I said. I said

      consider faster hardware first

      In the example you provided, you considered hardware and concluded that it wasn't practical to purchase new hardware to solve the problem. My suggestion was to at least evaluate hardware improvements as an option before you throw coder time at the problem. Coder time can sometimes seem "free" since they get paid every two weeks anyway, but there is still a cost to someone doing work.

      So I agree with your assessment for cases where a hardware upgrade is prohibitively expensive.