So my question is, why should we optimize, when that's so much more expensive than just using faster machines?
All I can offer is my personal take on the matter. There are a few reasons, IMO, that optimizing is preferable to "just using faster machines":
Pride in quality. I think one of the values our society is slowly losing is pride in creating something of quality for its own sake. Quality workmanship ends up priced out of the range of most people, and the rush to commoditize and profit (and to consume, on the other end) creates an environment where shininess outweighs quality. That trend will someday lead to the commoditization of development (it's already happening in some places) and a lack of demand for developers capable of quality. All of which means that I will command a lower salary.
Economic enlightenment. It's all well and good to target deep pockets. On the other hand, selling lots of something at a low price has profit potential as well -- not everyone can afford the latest, greatest machine, especially in developing nations. I'd love to see any major software company that's trying to build markets in developing countries explain why they're not working on making sure their products perform acceptably on the machines that people can actually get there (e.g. P-II class machines).
Environmental awareness. Unfortunately, a lot of hazardous materials go into the manufacture of computer equipment; a good chunk of them stay in the machine and live in your home. That part is manageable, but disposal is an issue -- and recycling these materials isn't the ultimate solution, because that process itself creates hazardous waste (not as much, though: recycle if you can). Why should I be forced to get rid of a 900MHz machine that I know is capable of running the type of apps my employer uses, simply because the people who developed those apps were careless?
Granted, there are things we can do to mitigate these issues, and I do encourage them. For example, that "old" 900MHz machine might find its way into a high-availability web cluster or to the test lab for the integrators to poke at. But I still think that development organizations have a degree of responsibility to be reasonably aware and careful -- to hire programmers who can think ahead and be reasonably conservative about resource use, and to encourage them to do so.
And, to beat this dead horse a little harder, I remind you that I'm not necessarily talking about optimizing or refactoring: just about thinking ahead and avoiding needless resource use.