Every so often, Perlmonks gets a SoPW asking "Which is faster, A or B?" Invariably, a whole bunch of benchmarks are run and, 99% of the time, the difference, if any, averages out to about 3 microseconds. That's 3 millionths of a second. In other words, if you did the slower operation 330,000 times, it might take an extra second. Might.
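
A typical shoot-out of this kind looks something like the sketch below, using the core Benchmark module's cmpthese. The map-vs-foreach comparison is just an assumed example of the sort of question these nodes ask; the point is that whatever numbers come out, the absolute difference per call is tiny.

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

my @nums = ( 1 .. 100 );

# A typical SoPW question: is map faster than an explicit loop?
# Run each style for at least 2 CPU seconds and compare rates.
cmpthese( -2, {
    map_style => sub { my @sq = map { $_ * $_ } @nums },
    for_style => sub { my @sq; push @sq, $_ * $_ for @nums },
});
```

Both styles produce the same list of squares; cmpthese just tells you which one does it a few microseconds sooner.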

I cringe every time I read one of these nodes. In 99% of the cases, CPU cycles simply don't matter, and here's why:

An hour of programmer time costs an employer between $80 and $100. That includes salary, benefits, legal costs, office space, power, machines, and so on. A new server costs, roughly, $8,000-$10,000 once you factor in the hardware, power, and sysadmin time. So a new machine costs, roughly, 100 hours of programmer time.

Let that sink in. A new machine costs about two and a half weeks of your time. Or, put another way, you are worth about 20 machines per year.
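
The back-of-envelope arithmetic, taking the midpoints of the figures above (an assumed $90/hour and $9,000/server, 40-hour weeks, 2,000 working hours a year):

```perl
use strict;
use warnings;

my $hourly_rate = 90;      # midpoint of $80-$100/hour
my $server_cost = 9_000;   # midpoint of $8,000-$10,000

my $hours_per_server   = $server_cost / $hourly_rate;   # 100 hours
my $weeks_per_server   = $hours_per_server / 40;        # 2.5 work weeks
my $servers_per_year   = 2_000 / $hours_per_server;     # 20 servers

printf "%d hours, %.1f weeks, %d servers/year\n",
    $hours_per_server, $weeks_per_server, $servers_per_year;
```

Shift the assumptions however you like; the conclusion survives an order-of-magnitude fudge in either direction.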

Anything you do needs to be measured in that light by the person paying you. Is it worth having my programmer spend 3 days here and 2 days there squeezing out performance? Or is it worth just buying another machine and having my programmer spend that time actually advancing my business? Because there's an additional opportunity cost to squeezing out performance: you're not doing something else that actually makes money.

So, to recap - CPU cycles don't matter because the limiter isn't the CPU anymore - it's the programmer. Especially in a language like Perl.

My criteria for good software:
  1. Does it work?
  2. Can someone else come in, make a change, and be reasonably certain no bugs were introduced?
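
Criterion 2 is mostly about tests. One common way to buy that certainty in Perl is a small test suite with the core Test::More module; here's a minimal sketch, where `discount` is a made-up function standing in for whatever someone else might need to change later:

```perl
use strict;
use warnings;
use Test::More tests => 2;

# Hypothetical business logic a future maintainer might touch.
sub discount {
    my ( $price, $pct ) = @_;
    return $price * ( 1 - $pct / 100 );
}

is( discount( 100, 10 ), 90,  'ten percent off' );
is( discount( 100, 0 ),  100, 'no discount' );
```

If those tests still pass after the change, the maintainer can be reasonably certain no bugs were introduced. That certainty is worth far more than 3 microseconds.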