in reply to speed factor

The cost of a new webapp machine (including all provisioning, bandwidth, sysadmin costs, etc.) is generally the cost of 1 week of developer time. The cost of a new database machine (about 4x as powerful) is generally 2 weeks of developer time. If I can write something in 1 week (in Perl) and it requires 2 machines, or take 4 weeks (in C++) and it requires 1, I have saved two weeks of my salary by going with Perl (1 week of development plus 2 machine-weeks, versus 4 weeks plus 1). This assumes that the C++ version even does everything the Perl one does and that the C++ one requires half the resources of the Perl one. (Both have proven to be faulty assumptions.)
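
To make the back-of-the-envelope math concrete, here is a quick sketch in Perl. It only illustrates the arithmetic; everything is measured in developer-weeks, using the 1-week-per-webapp-machine estimate above, and the figures are the ones from this example rather than universal constants:

  #!/usr/bin/perl
  use strict;
  use warnings;

  # All figures are in weeks of developer time, per the estimates above:
  # a webapp machine costs roughly 1 developer-week all-in.
  my $machine_cost = 1;
  my %option = (
      perl => { dev_weeks => 1, machines => 2 },
      cpp  => { dev_weeks => 4, machines => 1 },
  );

  for my $lang (sort keys %option) {
      my $total = $option{$lang}{dev_weeks}
                + $option{$lang}{machines} * $machine_cost;
      printf "%-4s: %d developer-weeks total\n", $lang, $total;
  }
  # Output: cpp comes to 5 developer-weeks, perl to 3 -- the two-week
  # difference described above.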

Furthermore, the Perl version is going to be easier and safer to extend than the C++ version. New developers are going to be productive more quickly. And, there is more battle-tested code for Perl (via CPAN) than there is for C++.

The key is determining where your costs are. 30 years ago, the significant cost was hardware, so you optimized for hardware speed. Today, the significant cost is developer time, so you optimize for developer speed. This is the origin of "throwing hardware" at a problem. It's usually the right business decision.


My criteria for good software:
  1. Does it work?
  2. Can someone else come in, make a change, and be reasonably certain no bugs were introduced?

Re^2: speed factor
by Anonymous Monk on Nov 14, 2007 at 01:53 UTC
    There are tons of hidden costs in those extra servers, of course, such as "we'll need another server room"... Plus, every new box adds a maintenance burden.
      Of course. Some you didn't mention are:
      • power costs
      • disaster recovery planning
      • load balancers
      • proxy servers
      • a SAN/NAS (and its backup)
      • additional internal gigabit networking
      • retooling apps to live on multiple servers vs. just one
      • planning and handling failover between servers
      The point is that, in general, the TCO of a new server tends to average between 5 and 7 days of developer time. That's roughly $4000-$6000. (Yes, a good developer has an average TCO of $800-$850/day.) That gets you a nice server-class box with dual dual-core CPUs, 2G of RAM, a decent set of disks, and 2 gigabit NICs. As server prices drop and clustering technologies improve, those hardware numbers keep going down. The cost of that (good) developer is only going to go up.
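      To show where those round numbers come from, a throwaway check using the day figures just quoted (nothing more than the multiplication spelled out):
        use strict;
        use warnings;

        # Days of developer time per server times the developer's daily TCO.
        printf "low end:  \$%d\n", 5 * 800;   # $4000
        printf "high end: \$%d\n", 7 * 850;   # $5950, i.e. roughly $6000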
