Of course. Some you didn't mention are:
- Power costs
- Disaster recovery planning
- Load balancers
- Proxy servers
- A SAN/NAS (and its backup)
- Additional internal gigabit networking
- Retooling apps to live on multiple servers vs. just one
- Planning and handling failover between servers
The point is that, in general, the TCO of a new server tends to average between 5 and 7 days of developer time, which works out to roughly $4000-$6000. (Yes, a good developer carries an average TCO of $800-$850/day.) That buys you a nice server-class box with two dual-core CPUs, 2 GB of RAM, a decent set of disks, and two gigabit NICs. As server prices keep dropping and clustering technologies improve, those hardware numbers keep going down. The cost of that (good) developer is only going to go up.
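For a quick sanity check, here's a minimal sketch of that back-of-envelope math. The figures are just the rough ranges quoted above (developer TCO of $800-$850/day, server TCO of 5-7 developer-days), not measured data:

```python
# Back-of-envelope sketch of the server-vs-developer TCO comparison above.
# Both ranges are assumptions taken from the text, not measured figures.

DEV_TCO_PER_DAY = (800, 850)      # fully loaded cost of a good developer, $/day
SERVER_TCO_IN_DEV_DAYS = (5, 7)   # typical TCO of a new server, in developer-days

low = SERVER_TCO_IN_DEV_DAYS[0] * DEV_TCO_PER_DAY[0]
high = SERVER_TCO_IN_DEV_DAYS[1] * DEV_TCO_PER_DAY[1]

print(f"Server TCO: roughly ${low:,} to ${high:,}")
# -> Server TCO: roughly $4,000 to $5,950
```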
My criteria for good software:
- Does it work?
- Can someone else come in, make a change, and be reasonably certain no bugs were introduced?