http://qs1969.pair.com?node_id=63080


in reply to Benefits of the Specification

The really Bad Thing about not thinking about the application is that every time you add something, something else will inevitably break. It's like those squishy tubes that you squeeze and they fly out of your hand...an endless process of adding in kludges.

h0mee and I were driving around one night discussing breakage, and came to the conclusion that It's All About The Data. Knowing your data will be the difference between breakage and non-breakage; between a maintenance nightmare and a maintenance pleasure. For example, in Programming Pearls, in the first column, Bentley (the author) discusses an old-school problem of sorting thousands of 1-800 numbers using about a megabyte of memory. They ran into some Really Bad time-space issues with certain algorithms, so they used a bit vector to map the data. It was a novel solution because it was engineered to the data. The problem is maintaining a symbiotic relationship between data and algorithm whilst preventing breakage. Verily, this is very difficult.
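Roughly, the bit-vector trick works like this (a minimal sketch in C, not Bentley's actual code, and the stdin input format is my assumption): give every possible seven-digit number its own bit, set the bit as each number is read, then scan the bits in order and the numbers come out sorted. Ten million bits is only about 1.25 MB, which is what makes it fit the memory budget.

    #include <stdio.h>

    #define MAX_NUMBER 10000000L   /* seven-digit numbers: 0 .. 9,999,999 */

    /* One bit per possible number: about 1.25 MB, zero-initialized. */
    static unsigned char bits[MAX_NUMBER / 8 + 1];

    static void set_bit(long n)  { bits[n / 8] |= (unsigned char)(1u << (n % 8)); }
    static int  test_bit(long n) { return bits[n / 8] & (1u << (n % 8)); }

    int main(void)
    {
        long n;

        /* Pass 1: mark a bit for each number read from stdin.
           Assumes each number appears at most once, as in Bentley's problem. */
        while (scanf("%ld", &n) == 1)
            if (n >= 0 && n < MAX_NUMBER)
                set_bit(n);

        /* Pass 2: scan the bits in order; the set ones come out sorted. */
        for (n = 0; n < MAX_NUMBER; n++)
            if (test_bit(n))
                printf("%07ld\n", n);

        return 0;
    }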

redmist
Silicon Cowboy

Replies are listed 'Best First'.
Re: (redmist) Re: Benefits of the Specification
by dws (Chancellor) on Mar 08, 2001 at 23:56 UTC
    The really Bad Thing about not thinking about the application, is that every time you add something, something else will inevitably break.

    Until recently, this has been the predominant view: if you don't Plan Ahead, things will break later.

    Extreme Programming offers a novel counter-proposal: if you only think one step ahead, and at the end of each step you have a well-tested, well-factored system, then the chances of your breaking something on the next, not-yet-thought-of step are reduced to near zero. Well-factored implies good structure, with things done "once and only once". That means that many decisions are well localized. A well-localized decision is much easier to safely change or replace than one whose effects have spread out over time.
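    To make "once and only once" concrete, here is a toy sketch (mine, not anything from the discussion above; the names and the retry policy are made up for illustration): the retry limit is decided in exactly one place, so changing that decision later means touching one line rather than hunting down every copy of it.

        #include <stdio.h>

        /* The single, well-localized decision: how many times to retry.
           Change it here and every caller picks up the new policy. */
        enum { MAX_RETRIES = 3 };

        /* Hypothetical stand-in for an operation that can fail. */
        static int try_send(const char *msg)
        {
            printf("attempting to send: %s\n", msg);
            return 0;   /* pretend it always fails, so the retry loop is exercised */
        }

        static int send_with_retries(const char *msg)
        {
            for (int attempt = 0; attempt < MAX_RETRIES; attempt++)
                if (try_send(msg))
                    return 1;   /* success */
            return 0;           /* gave up after MAX_RETRIES attempts */
        }

        int main(void)
        {
            if (!send_with_retries("hello"))
                printf("send failed after %d attempts\n", MAX_RETRIES);
            return 0;
        }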

      The XP point of view sounds good too, but I don't think that the two philosophies are opposed to each other. It sounds to me like they both emphasize planning and thinking about the project before something is implemented, and that one way XP differs is that it plans in smaller chunks. Correct me if I am wrong.

      redmist
      Silicon Cowboy
      UPDATE: Thanks for the clarification, dws.
        In both the traditional model and in XP, you might be thinking ahead strategically. They part company when it comes to tactics. The traditional model requires what XP calls Big Design Up Front, which involves planning out design and construction well ahead of time. XP reduces tactical scope by restricting design and implementation to one "feature" (user story) at a time.

        In the traditional model, you (try to) avoid coding conflicts by planning ahead. In XP, you avoid conflicts by taking one step at a time, having a full set of regression tests so that you can prove on short notice that your system is in a working state, and refactoring before you're done with the step.

        There's more parallelism in the traditional model, with testing deferred until the system is once again in a working state. XP tests all the way along, keeping the system in a working state between short steps.

        Clarification: In the traditional model, the phases of development happen in serial, while implementation activities happen in parallel. In XP, phases happen more in parallel, with implementation happening more in serial.

        Thanks to tilly for indirectly pointing out the need for a clarification.