in reply to multi-PC tasking

I'm young by most coder standards (only 25), but I remember doing punch-key coding of hex that represented assembler instructions, and toggling boot code on a PDP-11.

The amount of computing power on the average desktop never ceases to amaze me -- and neither does the mentality I see where someone who plays Solitaire, uses Word, and surfs the 'Net decides a 2GHz machine with 1GB of RAM is "too slow" and drops $2K on a new box.

At home, I have a 2.4GHz box w/1GB of RAM -- I've turned swap off, because I wasn't using it. Granted, that's Linux. But at work, I have a 600MHz box with 512MB and WinXP that I've been happily using to develop Perl (using Eclipse, no less!). Most people have far more computing power available to them than they will ever need. My PHB's PDA has a 400MHz processor -- 400MHz in a glorified address book, my GOD!

I blame two groups: marketing and development. Marketing is responsible for what they usually do: "but, you need the newest stuff, or you won't get laid!" But "new-world" developers who learned to code at universities that used dual-Xeon boxes with 4GB of RAM don't appreciate optimization and conservation of resources. It is the "well, it will be slow on anything under 1GHz, but who uses that anymore?" attitude that drives the rush to faster, hotter, more power-hungry devices.

Now, to a certain extent, processor time is cheaper than programmer time: no one needs to optimize Word for a 33MHz machine anymore. But when a 500-user web application that merely displays rows from a (separate) database needs a quad-processor Xeon to perform acceptably, we have problems. Especially when someone else (yeah, toot, toot, it was me) can re-write the thing in a week, using an Interpreted Language (Perl) via CGI and move it to a single-processor 333MHz machine.
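
For what it's worth, the replacement really was doing nothing fancier than pulling rows out of the database and printing them. A minimal sketch of that pattern -- plain CGI plus DBI, with a made-up DSN, table, and column names rather than the actual code -- looks something like this:

    #!/usr/bin/perl
    # Sketch only: fetch rows from a (separate) database and print them
    # as an HTML table. The DSN, credentials, table ("orders"), and
    # columns are hypothetical placeholders, not the real app's schema.
    use strict;
    use warnings;
    use CGI qw(header escapeHTML);
    use DBI;

    my $dbh = DBI->connect(
        'dbi:mysql:database=app;host=dbhost',    # assumed DSN
        'report', 'secret',
        { RaiseError => 1 },
    );

    print header('text/html'), "<html><body><table>\n";

    my $sth = $dbh->prepare('SELECT id, customer, total FROM orders');
    $sth->execute;
    while ( my @row = $sth->fetchrow_array ) {
        print '<tr><td>',
              join( '</td><td>', map { escapeHTML($_) } @row ),
              "</td></tr>\n";
    }
    print "</table></body></html>\n";

    $dbh->disconnect;

Nothing clever is going on there -- which is exactly the point: a page like that has no business needing a quad-Xeon.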

The big question is this: "what can we do about it?" I don't think anything, except refusing to buy/use software that is needlessly bloated. But that even excludes Gnome these days...

<-radiant.matrix->
Larry Wall is Yoda: there is no try{} (ok, except in Perl6; way to ruin a joke, Larry! ;P)
The Code that can be seen is not the true Code
"In any sufficiently large group of people, most are idiots" - Kaa's Law

Replies are listed 'Best First'.
Re^2: multi-PC tasking
by samizdat (Vicar) on Aug 26, 2005 at 16:54 UTC
    I often wonder what the impetus will be for today's CS students to grok efficiency in cycles, bandwidth, or context switching. I had to throw away (well, recycle) a whole bunch of donated P-II's because schools in the South Valley turned up their noses at them. Never mind that they rendered BSD-based Blender3D faster than P-4's on XP could refresh Corel! No, they were "old".

    Besides the laziness, there's an IT mentality (Re: On the wane?) that thrives on making out PO's for bigger iron and more Microsoft. I'm not sure if it evolved from bureaucracy or whether it's a parallel development. The CYA/job security aspect is surely evident in both. Time and again I see the same story about successful replacement using open source as you relate, and, more often than not, within six months the guy who reports it has moved on to a more stimulating job/culture.

      That's a sad comment on the state of education today. I'd have guessed it would be easy to find a bunch of teenage computer club geeks, give them a bunch of old hardware, throw in a couple books on Beowulf clusters and watch them happily build their own supercomputer...

      -xdg

      Code written by xdg and posted on PerlMonks is public domain. It is provided as is with no warranties, express or implied, of any kind. Posted code may not have been tested. Use of posted code is at your own risk.

        We do have three or four LUG/OSUG variants here in central New Mexico, and there are some teens in them, but in attempting to jump-start what you suggested in the schools, I've noticed very little internal motivation in kids to put anything more complicated than a game console together. I tried teaching/volunteering in gifted classrooms for several years, and, without externally applied motivation, it fell flat as soon as the bell rang.

        I blame two things: high-bandwidth one-way entertainment and authoritarian schools. Truth is, sitting still and accepting what's offered is what's rewarded in most classrooms. Even in the gifted classrooms, most of the kids' brain cycles were spent in figuring out what behavior was going to be rewarded.
Re^2: multi-PC tasking
by Anonymous Monk on Aug 26, 2005 at 18:11 UTC
    The big question is this: "what can we do about it?" I don't think anything, except refusing to buy/use software that is needlessly bloated. But that even excludes Gnome these days...

    You can't get around a simple fact: programmer time is a resource, too.

    It takes time to optimize code for size, speed, or performance, and a programmer has to give up other things they could be doing in order to make those optimizations.

    When the benefits of that time investment outweighed the costs, it was a good practice. Typically, those benefits were the freeing up of scarce resources: RAM, processor time, and so on.

    Now that those resources are no longer scarce, the benefits of doing all that extra optimization work become increasingly hard to justify from a cost/benefit standpoint.

    If it costs the developer valuable time, and doesn't save anyone any time or money in return, how is it of value? In any company that values profits, it's not.

    Also: if an obvious, inefficient, dead-simple brute-force algorithm will do the job, and a tricky, complex, brittle, hard-to-understand algorithm will do it twice as fast, you code it the brute-force way. Why? Because it's good enough, and the cost of maintainer time is much more important than the fact that the program technically runs in ten milliseconds instead of only half a millisecond.
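
    To make that concrete with a deliberately tiny, made-up example (the data and sub names are invented for illustration): a membership check done as a brute-force scan versus a pre-built hash index. At the scale most code actually runs at, the dumb version is plenty fast, and it's the one the next maintainer understands at a glance.

        #!/usr/bin/perl
        # Invented example: "brute force" membership test (re-scan the
        # list on every call) versus the "optimized" version (pre-build
        # a hash index). For small inputs both are instant; the simple
        # one wins on readability.
        use strict;
        use warnings;

        my @allowed = qw(alice bob carol dave);

        # Brute force: linear scan, O(n) per lookup.
        sub is_allowed_simple {
            my ($user) = @_;
            return scalar grep { $_ eq $user } @allowed;
        }

        # "Clever": hash index, O(1) per lookup, a bit more machinery.
        my %allowed = map { ( $_ => 1 ) } @allowed;
        sub is_allowed_fast {
            my ($user) = @_;
            return exists $allowed{$user};
        }

        print is_allowed_simple('carol')  ? "carol: ok\n"   : "carol: no\n";
        print is_allowed_fast('mallory')  ? "mallory: ok\n" : "mallory: no\n";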

    It's just simple economics, really. Don't expend expensive resources trying to save cheap ones; do the opposite. As computing resources have grown exponentially cheaper, our costing priorities have had to shift to keep up.

    It's no longer worth spending an hour of programmer time to save an hour of computing time, because the programmer's time costs much more than the computer's time. On the early mainframes, it was the exact opposite. -- AC

      Well, quite often you would not need any advanced optimization. Quite often all you'd need is someone who's been around for some time to be asked to have a glance over the code of the youngsters and tell them not to quote variables they wanna pass to a function (hey, this is Perl, not a shell script), to add a few indexes in the database here and there, to use Int instead of Numeric(9) in the database, to use this or that module instead of wasting time trying to control MS Excel or MS Word via Win32::OLE, ...

      No one expects people to rewrite parts of their code in assembly to speed it up, or to spend hours hunting for the most efficient way to do something just to save a few cycles. Even the very basic, easy-to-implement things can help a lot. And all it would take is for management to understand that time spent teaching and learning, and time spent reviewing each other's code, is not wasted.
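
      A couple of those easy wins, sketched with hypothetical names (illustration only, not anyone's production code):

          #!/usr/bin/perl
          # Quoting a variable before passing it stringifies it, so a
          # reference (or object) arrives as a useless "ARRAY(0x...)" string.
          use strict;
          use warnings;

          sub count_rows {
              my ($rows) = @_;            # expects an array reference
              return scalar @$rows;
          }

          my $rows = [ [ 1, 'alice' ], [ 2, 'bob' ] ];

          # my $n = count_rows("$rows");  # wrong: dies under strict refs
          my $n = count_rows($rows);      # right: pass the variable itself
          print "$n rows\n";

          # Likewise, a single index on the column a slow report filters by
          # is often all the "optimization" it needs (names are made up):
          #
          #   CREATE INDEX idx_orders_customer ON orders (customer_id);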

      Jenda
      XML sucks. Badly. SOAP on the other hand is the most powerful vacuum pump ever invented.

Re^2: multi-PC tasking
by spiritway (Vicar) on Aug 27, 2005 at 21:14 UTC

    Your point is well taken. Programs have grown fat and lazy because there is little incentive to optimize them. Even a badly written implementation will seem to shine on good, fast hardware.

    I can understand this a bit in commercial shops where profits matter. Programmer time is far more expensive than CPU time and memory. Worse yet, the time needed to optimize could delay the product, allowing a competitor to beat you to market. Let's face it - most software companies don't even do a thorough job of testing and debugging their products. The attitude seems to be, "Hey, it compiles! Ship it."

    Unfortunately, it's not just market pressure that drives this bloat. We see it in Open Source programs, too. Much as I'd love to blame it all on Microsoft, it's pervasive throughout the industry.

    But I'm wondering whether this is really a Bad Thing. Yes, it goes against the grain. It bothers me that programs are bloated and sluggish - but if it's easier to just use fast hardware to compensate, does it really make much difference? Isn't that just using the resources in the most economical way? I don't know.

    So my question is, why should we optimize, when that's so much more expensive than just using faster machines?

      So my question is, why should we optimize, when that's so much more expensive than just using faster machines?

      All I can offer is my personal take on the matter. There are a few reasons, IMO, that optimizing is preferable to "just using faster machines":

      Pride in quality. I think one of the values that our society is slowly losing is pride in creating something of quality for its own sake. Quality workmanship ends up priced out of the range of most people, and the rush to commoditize and profit (and to consume, on the other end) creates an environment where shininess outweighs quality. That trend will someday lead to the commoditization of development (it's already happening in some places), and to a lack of demand for developers capable of quality work. All that means is that I will command a lower salary.

      Economic enlightenment. It's all well and good to target deep pockets. On the other hand, selling lots of something at a low price has profit potential as well -- not everyone can afford the latest, greatest machine, especially in developing nations. I'd love to see any major software company that's trying to build markets in developing countries explain why they're not working on making sure their products perform acceptably on the machines that people can actually get there (e.g. P-II class machines).

      Environmental awareness. Unfortunately, a lot of hazardous materials go into the manufacture of computer equipment, and a good chunk of them stays in the machine and lives in your home. That's manageable while the machine is in use, but disposal is an issue -- and recycling these materials isn't the ultimate solution, because that process itself creates hazardous waste (not as much, though: recycle if you can). Why should I be forced to get rid of a 900MHz machine that I know is capable of running the type of apps my employer uses, simply because the people who developed the apps were careless?

      Granted, there are things we can do to mitigate these issues, and I do encourage them. For example, that "old" 900MHz machine might find its way into a high-availability web cluster or to the test lab for the integrators to poke at. But I still think that development organizations have a degree of responsibility to be reasonably aware and careful -- to hire programmers who can (and encourage them to) think ahead and be reasonably conservative about resource use.

      And, to beat this dead horse a little harder, I remind you that I'm not necessarily talking about optimizing or refactoring: just about thinking ahead and avoiding needless resource use.

      <-radiant.matrix->
      Larry Wall is Yoda: there is no try{} (ok, except in Perl6; way to ruin a joke, Larry! ;P)
      The Code that can be seen is not the true Code
      "In any sufficiently large group of people, most are idiots" - Kaa's Law