in reply to Re^2: Spreadsheets, HTTP::Recorder and interactivity
in thread Spreadsheets, HTTP::Recorder and interactivity

Two comments immediately come to mind.

First, this programming technique is indeed a fast way to solve certain kinds of repeated interaction problems, but it leads to fragile systems. They tend to be poorly laid out, hard to understand, and full of failure modes that nobody handled. You'll recognize this description if you've ever dealt with a critical business process that started life as a glorified spreadsheet from a non-programmer. Those drawbacks aren't a big problem for testing suites. But they are if you're maintaining production systems.

Second, my personal reaction to anyone who excitedly talks about the next big paradigm shift is to label them as someone who likely doesn't know what they're talking about, and almost certainly doesn't understand Kuhn. Put simply, a paradigm worth discarding at the drop of a hat wasn't worth owning, and the paradigm you discarded it for probably isn't either. Paradigm shifts are not good things; they're very expensive messes that sometimes become unavoidable. The value perceived in them could not arise but for the far less visible work of people operating within good paradigms before and after the shift.

If the second comment puzzles you, read The Structure of Scientific Revolutions by Thomas Kuhn. Again if necessary.

Re^4: Spreadsheets, HTTP::Recorder and interactivity
by zby (Vicar) on Jul 01, 2004 at 16:46 UTC
    To the first I answer Worse Is Better. You might think you can design a system top-down, but in reality successful systems are usually built incrementally. It's all about a short feedback loop. And you can refactor it later if it really builds up to something big.

    To the second: I've already apologised for using somewhat grandiose words. Should I delete them? I think not; that would leave your comment hanging in a void, and at least the words are not misleading the way wrong code would be.

    Update: Is the meaning of the term 'paradigm shift' restricted to what Kuhn meant by it? I am not sure. Look, for example, at ParadigmShift for a more liberal definition.

      And you can refactor it later if it really builds up to something big.

      Oh, you've hit a pet peeve of mine. I work at a company that had revenues of roughly $65 million last year. We have grown revenue an average of 72% year on year for the past 7 years. We are looking at a 125% increase in revenue this year, assuming that one third of the projected contracts are actually signed. It could be as high as 200%. This means that 7 years ago, we had revenues of $1.5 million. In 8 years, we will have increased 100x. Our IT staff has gone from 2 people to 22 people. Our total staff has gone from 20 to 250.

      What does this have to do with anything, you might ask. It's very simple: our systems have grown incrementally with an extremely short feedback loop. And they are best described as cancers or weeds. The system is a mess, and it is more dangerous to modify it than it is to throw it out and start over from scratch. Except that refactoring now would cost $500,000 and take a year. Oops!

      Remember - incremental design does not in any way imply successful design. In fact, it almost never does. You end up traversing the twisty maze of:

      • Poor requirements (assuming you even got any)
      • Inadequate development time (assuming you even got a fixed deadline)
      • Zero testing (TIP/DIP, anyone?)
      • Inappropriate executive-level decisions
      • The wrong people in the wrong positions
      • Complete lack of experience in fundamental roles

      You are right: incremental development tends to be a hallmark of most successful modern applications, proprietary software aside. It is a necessary condition, but it most certainly is not a sufficient condition. There are at least three other required conditions I can think of that incremental development doesn't even begin to touch.

      • Clear and concise development process which includes peer review at every stage and a strong focus on repeatable testing
      • Clear and concise requirements
      • Experienced management that understands the division between their roles and the roles of their staff

      Give me a project with those three conditions and I'll be happy to work top-down. Give me a project whose only bright point is incremental development and I'll quit in less than 3 months.

      Another personal story - I worked at an e-commerce firm which had a continuously incremental development cycle.

      • We'd release on Sunday night.
      • Monday, we'd receive requirements.
      • We would have 27 days to complete development on those requirements.
      • We released four weeks from the prior release.

      Yet we had no test suites, no design, no peer review ... no nothing. When I asked about design, my director said "Do it on your own time." But we had incremental development, so we should've been OK, right?

      ------
      We are the carpenters and bricklayers of the Information Age.

      Then there are Damian modules.... *sigh* ... that's not about being less-lazy -- that's about being on some really good drugs -- you know, there is no spoon. - flyingmoose

      I shouldn't have to say this, but any code, unless otherwise stated, is untested

      Worse Is Better is not a panacea meant to justify every bad development process. Likewise, you're drawing a false dichotomy when you portray this as a discussion of incremental vs. waterfall development models, with me as the sclerotic dinosaur who doesn't yet know that he's dead.

      The key understanding that Worse Is Better is groping for is that as you change which set of normative standards you use to judge something, what is worse according to one reasonable standard can be better according to another. Richard Gabriel gropes after this idea by comparing two sets of competing norms that are relevant to software development.

      So yes, what is worse by one standard can be better by another. Know the advantages and disadvantages of the approaches you are considering, know your goals, and then act accordingly. You stated the advantages of the "record a macro" approach to development. I've let you know the big disadvantages, and given an example that is very similar to what typically happens. The disadvantages basically boil down to its being a fast road to a big ball of mud. I gave a case where those disadvantages are not a problem, and a case where they are. I think this is useful knowledge to have when you decide where to use this (new to you) development practice.

      Given that I've stated the disadvantages of the approach, I probably should spare a couple of words on why it works out that way.

      Well-organized systems always have some kind of implicit internal theory of operation. There is some sense of what things are to be found where, why they are there, and what kinds of internal divisions to look for and maintain. This theory need not be explicitly stated, but it is understood by the people who are proficient with the system. If you've experienced this, then you know what I'm talking about. If you haven't, then the excellent essay Programming as Theory Building by Peter Naur (the "N" in "BNF") might help. Unfortunately I don't know of copies on the web; I read it in Agile Software Development.

      Note that this organization does not have to be there from the start. There are many ways to let an organizational structure emerge iteratively - XP people are big on trying to do exactly that.

      However, this kind of organization needs a lot of contextual information to find its way into the development process. Similar things go together. Information about why things are done in a particular way circulates among the developers. People spend time thinking about how things should be done.

      Opportunities to include that kind of contextual information are sorely missing from the "record a macro" process. You do X, then Y. No information is captured about what the person was thinking when they did X. If you think you see something wrong - that X should really have been X' - there is no context to help you figure that out. The code you get is not factored in any way, shape or form. And while the person who recorded the macro may have had plenty of knowledge about what to do under various exceptional conditions, none of that knowledge is captured in the code.
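
      To make that concrete, here is a hypothetical sketch in Perl. The URLs, form fields, and failure checks are all invented for illustration, and the first half is merely representative of the flat scripts that recorders such as HTTP::Recorder emit, not actual output:

          # What the recorder gives you: a flat replay of clicks, with no
          # hint of intent and nothing done about failure.
          use WWW::Mechanize;
          my $agent = WWW::Mechanize->new();
          $agent->get('http://example.com/login');   # hypothetical site
          $agent->form_number(1);
          $agent->field('user', 'jdoe');
          $agent->field('pass', 'secret');
          $agent->submit();
          $agent->follow_link(text => 'Monthly report');

          # The same steps written by hand: the "why" is captured in names
          # and comments, and the exceptional cases the operator knew about
          # are checked in code.
          sub fetch_monthly_report {
              my ($user, $pass) = @_;
              my $agent = WWW::Mechanize->new(autocheck => 1);
              $agent->get('http://example.com/login');
              $agent->submit_form(
                  form_number => 1,
                  fields      => { user => $user, pass => $pass },
              );
              # This (imaginary) site answers 200 even on a bad password,
              # so the status code alone proves nothing.
              die "login failed for $user\n"
                  if $agent->content =~ /invalid password/i;
              my $link = $agent->find_link(text => 'Monthly report')
                  or die "report link missing - did the site layout change?\n";
              $agent->get($link->url);
              return $agent->content;
          }

      Nothing in the recorded half tells a maintainer which steps matter, and nothing stops it from silently "succeeding" on an error page; the hand-written half is longer, but the theory of its operation travels with it.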

      The result is exactly as I said.

      As for my comments on people talking about paradigm shifts, I meant that to be useful advice as well. It was criticism, sure, but it was meant as constructive criticism. Let me explain how.

      First of all, I was doing you the kindness of letting you know how I (and presumably other knowledgeable people) are likely to react when you run around waving the words "paradigm shift". Hopefully that will make you a little more careful with those words in the future.

      Furthermore I tried to explain why I react that way. That explanation attempted to summarize the most widely missed point in a classic work on the progress of our understanding. If you grasp that point, then you'll understand something key about learning. Perhaps you'll even come to share my skeptical response to people who begin extolling the benefits of joining in the latest "paradigm shift".

      I apologize that my lapsing into a form of terse information presentation apparently resulted in your misunderstanding my intentions. All that I can do is clarify what my intentions were and hope that they make sense to you in retrospect.

      UPDATE: By schlerotic I meant sclerotic. Fixed.

        At least my comment had the positive effect of inducing this detailed analysis of what is missing in the 'recorded macro' method. So now we have 'better interactivity' versus 'lost context'.

        By the way, chalk up the 'paradigm shift' to my laziness in coming up with a better word.