in reply to Re: (OT): 200-year software
in thread (OT): 200-year software

You make some excellent points. It is hard to write code that won't "break" if the underlying OS changes significantly.

Let's suppose that we start to make some software "bricks". That is to say, we lay down some basic rules which, by fiat, will remain fixed. Future computing systems can have extra functions, but they must support our fixed standards. For example, let's say that ASCII defines how our text files are encoded. (OK, OK, I realize this is behind the times, for good reason.) Using a set of standards like this ("bricks"), we could begin to engineer some long-lived software.
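A small sketch of the "brick" idea in Python, using a real case: ASCII's code points were frozen decades ago, and UTF-8 was deliberately designed so that every valid ASCII file is also valid UTF-8. Bytes written under the old fixed standard still read correctly under the newer one.

```python
# ASCII froze code points 0-127; UTF-8 was designed so any ASCII file
# is also a valid UTF-8 file. A "brick" that later systems must honor.
old_bytes = "Hello, 200-year software!".encode("ascii")

# Decades later, a UTF-8-era program reads the very same bytes unchanged.
assert old_bytes.decode("utf-8") == "Hello, 200-year software!"
assert old_bytes.decode("ascii") == old_bytes.decode("utf-8")
```

The design choice here is the brick: new encodings were allowed extra functions (more characters), but had to support the fixed standard.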

It really isn't too different from some of the things we have today. Email messages from 10 or 15 years ago are readable and sendable by today's email clients, even if the clients themselves are vastly different (improved? hmmm, maybe not). There are some "bricks" out there already, and I think we'll see more of them in the future, especially if we consciously try to create them.

-------------------------------------
Nothing is too wonderful to be true
-- Michael Faraday

Replies are listed 'Best First'.
Re^3: (OT): 200-year software
by graff (Chancellor) on Jul 16, 2004 at 04:47 UTC
    You make some excellent extensions to Husker's excellent points. Here are a couple more things that occurred to me as I read your post:

    Sure, bricks have been around a long time; they have always worked and still work basically the same way, and they offer a lot in terms of stability and value/cost ratio. But they have their limitations: a wall made of just bricks and mortar can only go so high before it needs to be designed with unacceptable thickness (value goes down while cost goes up). That's why a lot of builders use steel and concrete now; as technology develops better ways of building things, we can build bigger and better things. If we limit ourselves to just bricks, lots of things we take for granted today become impossible -- we'd be stuck with a "lowest-common-denominator" sort of existence (and more fatalities in earthquakes).

    Here the analogy to software holds up rather well. Sure, we can stick to ASCII-based stuff, or whatever sort of limitation seems like a "simple, stable, always there and always usable" approach. But that puts limits on inventing and using new things, doing more with less, and doing stuff that was impossible last year/last month/last week.

    This would suggest that the notion of "designing software for the next 200 years" is something that has a true and proper relevance for certain domains of software development, but not all domains. Someone will always want to read old email, old newsgroup postings, old web pages and document files -- if the means for doing this isn't stable for the next 100 years, there must at least be a reliable migration path for this data to keep it accessible, regardless of how software/OS/hardware species mutate over time. Durability has to apply to data more than it applies to the software that handles it.
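A toy sketch of such a migration path, keeping the data durable even as formats die. The colon-delimited source format and its field names here are hypothetical; the point is that migration reads the aging format and re-emits every field in a documented, self-describing one.

```python
import json

# Hypothetical migration: an aging colon-delimited record format is
# re-emitted as JSON, preserving every field plus a version marker,
# so the data outlives the software that originally wrote it.
def migrate(old_text: str) -> str:
    records = []
    for line in old_text.strip().splitlines():
        name, year, subject = line.split(":", 2)
        records.append({"name": name, "year": int(year), "subject": subject})
    return json.dumps({"format_version": 1, "records": records}, indent=2)

old = "graff:2004:200-year software\nHusker:2004:OS churn"
migrated = json.loads(migrate(old))
assert migrated["records"][0] == {"name": "graff", "year": 2004,
                                  "subject": "200-year software"}
```

As long as each generation of software can perform this kind of lossless hop, the data stays accessible no matter how the software species mutate.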

    One other point, from a linguistic perspective: while there are obvious and fundamental differences between human languages and computer languages, the two kinds share an intrinsic, unavoidable property: their forms change over time, and the changes are determined in part by the environments in which they are used. This has a massive impact on programming languages, and asserting that we can write software now that will make sense in 100 years (or even half that) is like asserting that people who read Neal Stephenson or Tom Clancy could just as easily read Chaucer. NOT.

      "Durability has to apply to data more than it applies to the software that handles it."

      I can't agree with this more. I'd like to build consensus on this, to the point where we can put it down in law, so that data that belongs in the public arena (voting, public accounting, public records) is kept in a permanent, open, and documented format.

      "that puts limits on inventing and using new things, doing more with less, and doing stuff that was impossible last year/last month/last week."

      This is true. Innovation will occur that won't fit nicely into old paradigms. Continuing my RFC analogy, that needn't prevent us from adopting new paradigms that include backward compatibility with the old as well as new features.
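A minimal sketch of that kind of paradigm shift with backward compatibility, in the RFC spirit of being liberal in what you accept. The document fields and version numbers here are hypothetical: a version-2 reader still accepts version-1 documents, supplying defaults for fields the old format lacked.

```python
# Hypothetical backward-compatible reader: version 2 added an "encoding"
# field, but version-1 documents (which carried neither "version" nor
# "encoding") must still be readable with sensible defaults.
def read_record(doc: dict) -> dict:
    version = doc.get("version", 1)      # v1 docs carried no version field
    return {
        "body": doc["body"],             # required in every version
        "encoding": doc.get("encoding", "ascii"),  # v2 addition, v1 default
        "version": version,
    }

old_doc = {"body": "hello"}                                  # written by v1
new_doc = {"version": 2, "body": "hej", "encoding": "utf-8"}  # written by v2
assert read_record(old_doc)["encoding"] == "ascii"
assert read_record(new_doc)["encoding"] == "utf-8"
```

New features ride along in new fields, while the old paradigm remains a valid, fully supported subset -- the "brick" stays in the wall.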

      -------------------------------------
      Nothing is too wonderful to be true
      -- Michael Faraday