in reply to (OT): 200-year software

The author, Dan Bricklin, is quite a pioneer in computing. He's one of the saner voices out there.

However, I am of the opinion that his article is nothing but a pipe dream. Analogies between software "engineering" and civil engineering are enlightening, but mainly they enlighten us as to how different building a brick wall is from writing a firewall. Software firewalls are more complicated than brick walls, not only when considered as a single entity (a "wall"), but also when viewed at the component level.

Brick wall: composed of multiple, functionally identical bricks. Bricks are interchangeable. Must be properly designed by a qualified engineer, but can often be documented in one or two relatively simple blueprints. Can be assembled according to the blueprint by adequately trained bricklayers, and such training can produce basic competence in a matter of weeks. Relevant properties of the bricks, mortar, and wall (dimensions, location, strength, etc.) can be easily measured. Failed brick walls leave physical evidence which can be examined and analyzed to determine the cause of failure. Walls are built in the physical world, which is like an OS with a non-changing spec: gravity, mass, inertia, momentum, etc., are all constants. The tools used to build walls are also fairly simple, easy to operate, and well understood.

Software firewall: made of numerous unique bricks (software components or lines of code). Not only is each brick different, we have to make our own bricks, and each brick has its own unique failure mode. Each individual "brick" must be tested. Bricks are not interchangeable. Built in a "virtual world" defined by an operating system or a hardware system, which themselves change frequently and on occasion exhibit unpredictable or anomalous behavior. Documents which describe the wall are complex, and the wall typically cannot be built from these documents by someone who doesn't also possess intimate knowledge of walls, bricks, and the "world".

To me, the biggest problem is that of the "world". For brick walls, we have a consistent, predictable system on which we build brick walls. A brick wall built 100 years ago works on the same physical principles as a brick wall built today, or 1000 years ago, because the world of physics itself has not changed over that time. The software world, however, didn't even exist 100 years ago and is basically impervious to our attempts to guess what it will be like 20 years from now. Whatever we "know" about the "world" now is likely going to be obsolete and worthless in the medium, if not near, future. We cannot develop "best practices" because only experience teaches which practices are best, and since everything in computing changes so fast, there's no opportunity to gain that kind of experience.

Dan has it right ... to make software work like brick walls, the things in his article need to happen. I am just doubtful that those things will actually happen.

Re^2: (OT): 200-year software
by BrowserUk (Patriarch) on Jul 15, 2004 at 15:56 UTC

    Husker++. I really like the "world as an invariant OS" idea. One additional thought came to mind as I read that.

    When "bricks" were made by hand, whether they were the individual blocks in the pyramids 4500 years ago, or those that went into the Great Wall of China. Construction took a very long time.

    Modern bricks are machine made. Construction is much quicker.

    The software industry is still very young. We still haven't worked out how to make our bricks by machine.

    The electronics industry is just a few years older, but is considerably further along the evolutionary path: from hand-blown valves, through individually soldered transistors and discrete ICs, to large-scale and then very-large-scale integration.

    1. Assembler / C ~= transistors
    2. C++ / Java / Smalltalk ~= discrete ICs
    3. (Elements of) Perl / SQL (others?) ~= LSI
    4. ??? ~= VLSI

    Still a ways to go yet.

    Unfortunately, we are still making the tools we use to build our programs by hand. We need to get over the hump to the point where we can use tools to build our tools.


    Examine what is said, not who speaks.
    "Efficiency is intelligent laziness." -David Dunham
    "Think for yourself!" - Abigail
    "Memory, processor, disk in that order on the hardware side. Algorithm, algoritm, algorithm on the code side." - tachyon
      We need to get over the hump to the point where we can use tools to build our tools.

      One of the main problems here is that software evolves so fast that tools based on the languages in use today become nearly worthless very quickly. If there were a very good tool for creating complex software, but it was based on COBOL, very few programmers would use it.
      Even so, if someone could create such a software "brick maker", it might extend the life of the language used to build it.
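
      To make the idea concrete, here is a minimal sketch in Perl of what a "brick maker" might look like: a generator that emits a small, self-contained accessor class from a declarative spec. The spec format, class name, and fields are all invented for this illustration.

          #!/usr/bin/perl
          use strict;
          use warnings;

          # A toy "brick maker": generate a self-contained Perl class
          # (constructor plus accessors) from a declarative field list.
          # The spec below and every name in it are hypothetical.
          my %spec = (
              class  => 'Wall::Brick',
              fields => [qw(width height strength)],
          );

          my $code = "package $spec{class};\n"
                   . "use strict;\nuse warnings;\n\n"
                   . "sub new { my (\$class, %args) = \@_; bless { %args }, \$class }\n";

          for my $field ( @{ $spec{fields} } ) {
              $code .= "sub $field { my \$self = shift; "
                     . "\$self->{$field} = shift if \@_; "
                     . "\$self->{$field} }\n";
          }
          $code .= "\n1;\n";

          print $code;    # redirect to Wall/Brick.pm to "lay" the brick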

      -Theo-
      (so many nodes and so little time ... )

        I know many a paralegal who is absolutely satisfied with WordPerfect 5 on an i386 (repaired several times over, in most cases). I'm not convinced that it's the evolution of software or hardware that prevents the stabilization or longevity of applications. Rather, it's the evolving expectations of end users and consumers that are driving the phenomenon.

        How has the end-user's expectation of a brick changed in the last 2500 years?

        --Solo
        --
        You said you wanted to be around when I made a mistake; well, this could be it, sweetheart.
Re^2: (OT): 200-year software
by JanneVee (Friar) on Jul 15, 2004 at 16:22 UTC
    I do agree with you, but some things have not changed with computers, so some knowledge about computers is not obsolete. One of these durable things is imperative programming on a von Neumann machine, and I suspect that this is not going to change in the fifty years to come. Or, to say it another way: the details of computing change a lot, but the broader picture does not.
Re^2: (OT): 200-year software
by freddo411 (Chaplain) on Jul 15, 2004 at 18:41 UTC
    You make some excellent points. It is hard to write code that won't "break" if the underlying OS changes significantly.

    Let's suppose that we start to make some software "bricks". That is to say, we lay down some basic rules which, by fiat, will remain fixed. Future computing systems can have extra functions, but they must support our fixed standards. For example, let's say that ASCII defines how our text files are encoded. (OK, OK, I realize this is behind the times, for good reason.) Using a set of standards like this ("bricks"), we could begin to engineer some long-lived software.
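
    As a minimal sketch of treating ASCII as one such fixed "brick", a Perl reader could simply refuse any byte that falls outside the standard. The filename here is made up for the example.

        use strict;
        use warnings;

        # Treat ASCII as a fixed "brick": refuse any byte that falls
        # outside the standard. 'records.txt' is a hypothetical file.
        open my $fh, '<', 'records.txt' or die "open records.txt: $!";
        while ( my $line = <$fh> ) {
            die "non-ASCII byte at line $.\n" if $line =~ /[^\x00-\x7F]/;
            print $line;
        }
        close $fh;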

    It really isn't too different from some of the things we have today. Email messages from 10 or 15 years ago are readable, and sendable, by today's email clients, even if the email clients are vastly different (improved? hmmm, maybe not). There are some "bricks" out there already, and I think we'll see more of them in the future, especially if we consciously try to create them.
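
    For instance, the basic header format of Internet mail is one of those existing "bricks": the same few lines of Perl parse a header whether the message was written in 1989 or 2004. This is only a sketch of that idea; real mail handling (folded headers, MIME, and so on) needs a proper parser, and the addresses below are invented.

        use strict;
        use warnings;

        # A 1989 message and a 2004 message parse the same way.
        # Minimal sketch only: folded headers and MIME are ignored.
        my $message = join "\n",
            'From: husker@example.org',
            'To: monks@example.org',
            'Subject: 200-year software',
            '',
            'The body starts after the first blank line.',
            '';

        my ( $head, $body ) = split /\n\n/, $message, 2;
        my %header;
        for my $line ( split /\n/, $head ) {
            my ( $name, $value ) = $line =~ /^([\w-]+):\s*(.*)/ or next;
            $header{ lc $name } = $value;
        }
        print "Subject: $header{subject}\n";    # prints "200-year software"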

    -------------------------------------
    Nothing is too wonderful to be true
    -- Michael Faraday

      You make some excellent extensions to Husker's points. Here are a couple more things that occurred to me as I read your post:

      Sure, bricks have been around a long time, they have always worked and still work basically the same way, and they offer a lot in terms of stability and value/cost ratio. But they have their limitations: a wall made of just bricks and mortar can only go so high before it needs to be designed with unacceptable thickness (value goes down while cost goes up). That's why a lot of builders use steel and concrete now; as technology develops better ways of building things, we can build bigger and better things. If we limit ourselves to just bricks, lots of things we take for granted today become impossible -- we'd be stuck with a "lowest-common-denominator" sort of existence (and more fatalities in earthquakes).

      Here the analogy to software holds up rather well. Sure, we can stick to ASCII-based stuff, or whatever sort of limitation seems like a "simple, stable, always there and always usable" approach. But that puts limits on inventing and using new things, doing more with less, and doing stuff that was impossible last year/last month/last week.

      This would suggest that the notion of "designing software for the next 200 years" is something that has a true and proper relevance for certain domains of software development, but not all domains. Someone will always want to read old email, old newsgroup postings, old web pages and document files -- if the means for doing this isn't stable for the next 100 years, there must at least be a reliable migration path for this data to keep it accessible, regardless of how software/OS/hardware species mutate over time. Durability has to apply to data more than it applies to the software that handles it.

      One other point, from a linguistic perspective: while there are obvious and fundamental differences between human languages and computer languages, these two sets share an intrinsic, unavoidable property: their forms change over time, and the changes are determined in part by the environments in which they are used. This fact has a massive impact on programming languages, and to assert that we can write software now that will make sense in 100 years (or even half that) is like asserting that people who read Neal Stephenson or Tom Clancy could just as easily read Chaucer. NOT.

        Durability has to apply to data more than it applies to the software that handles it.
        I can't agree with this more. I'd like to build consensus on this to the point where we can put this down in law, so that data that belongs in the public arena (voting, public accounting, public records) is in a permanent, open, and documented format.
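        As a sketch of what "permanent, open, and documented" could look like in practice: plain ASCII, tab-separated, with the format described in the file itself. The filename, field names, and numbers below are all invented for the illustration.

            use strict;
            use warnings;

            # "Permanent, open, documented": plain ASCII, tab-separated,
            # with the format described in the file itself. The filename,
            # field names, and numbers are all hypothetical.
            open my $out, '>', 'precinct_results.txt' or die "open: $!";
            print {$out} "# precinct <TAB> candidate <TAB> votes, ASCII, one record per line\n";
            my @records = (
                [ 'Precinct-12', 'Smith', 1042 ],
                [ 'Precinct-12', 'Jones',  987 ],
            );
            print {$out} join( "\t", @$_ ), "\n" for @records;
            close $out;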
        that puts limits on inventing and using new things, doing more with less, and doing stuff that was impossible last year/last month/last week.
        This is true. Innovation will occur that won't fit nicely into old paradigms. Continuing my RFC analogy, that needn't prevent us from adopting new paradigms that include backward compatibility with the old, as well as new features.

        -------------------------------------
        Nothing is too wonderful to be true
        -- Michael Faraday