

in reply to Re: Re: (OT) Programming as a craft
in thread (OT) Programming as a craft

It's a fair point, but I wouldn't say that I ignored it, just that the terms of the analogy require reinterpretation to 'fit' them to the reality.

A compiler and the copy command are very cheap, provided everyone who wishes to use your product has exactly the same compiler and cp, er... copy command. But they don't, nor should they. Once you move out of your 'production plant', the effort required to copy and compile the product starts to rise. Perl itself runs in more places than almost anything else you care to name, but at what cost?

A Config that contains a little under 1000 variables.

A build process that, to be frank, makes the assembly instructions for your average automatic gearbox, self-assembly PC, even a full-blown kit car I once assembled, look relatively simple by comparison. So complicated, in fact, that it is necessary to distribute and build two copies of perl in each distribution. The first is a simplified version with just enough functionality to ease the problems of configurability, so that it can be used to glue together the many other tools, configurations and utilities that are required to build the full product.
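
As an aside, the scale of that Config is easy to check from Perl itself; the exact count varies from build to build and platform to platform, but a minimal snippet like this will report it:

    # Count the entries in Perl's %Config hash (varies by build and platform).
    use Config;
    printf "This perl was built with %d Config variables\n",
        scalar keys %Config;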

The alternative approach is the packaged software route, as exemplified by MS. With this, each application has to be pre-built to cater for all possible eventualities, and will only run on a very limited subset of target environments. Each application becomes enormous with the weight of its runtime configurability, despite the fact that late binding is available and that 'they' control the content and functionality of the environments that they target.

If 'production', in software terms, meant the copying of the code (compiled or source) onto a CD (or server) and distributing it, then that indeed is cheap. However, it doesn't. Most manufactured goods leave the production facilities as finished products ready for immediate use. Software, even the best packaged consumer software -- which currently probably means games -- is rarely "ready for use" as it leaves the production facilities. Even most games require a certain amount of expertise and knowledge on the part of the purchaser in order to install them and set them up for use. The exception is software that runs on proprietary, single-function hardware, where the variables can be controlled much more tightly than with general-purpose hardware such as PCs.

Currently, software manufacturers leave the final assembly, tuning and shakedown of their products to the purchaser, and the costs of the training, expertise and man-hours spent performing that final assembly -- and correcting it when it goes wrong -- are rarely considered alongside the purchase price, except in highly dubious 'true cost of ownership' surveys.

So, whilst the analogy leaves much to be desired, if you extend your thinking to encompass the complete process from initial concept to ready-to-use, the differences become less clear-cut.


Examine what is said, not who speaks.
"Efficiency is intelligent laziness." -David Dunham
"Think for yourself!" - Abigail
Hooray!

Re: Re: Re: Re: (OT) Programming as a craft
by chromatic (Archbishop) on Dec 16, 2003 at 19:17 UTC

    We're agreed that some software needs customization before it's usable by customers. (I've never noticed the apparent similarity in root words before. I'll have to look into that.)

    My difficulty with the assembly line image is that, with a physical product, the assembly line uses interchangeable workers and strict processes to make identical copies of a physical product cheaply and efficiently.

    Aside from an emotional reaction against the idea of treating programmers as interchangeable pieces, I cannot overlook the "make identical copies" part of the process. That, to me, is the reason an assembly line is possible! It exists precisely for mass duplication!

    Granted, there exist assembly lines dedicated to customization -- the worker who puts extra memory in laptops, for example -- but even then, the scope and breadth of the customizations are much, much smaller than in a software project.

    They also can't be divorced from the physical aspect. Certainly Apple could ship all PowerBooks with a gigabyte of memory, but they can't change one master PowerBook and duplicate it for every customer without touching every machine.

      Oh dear. Here I go again. (I do hope someone finds this worth reading).

      Okay, upon further reflection and reading, I agree with you. My analogy of programming with mechanical engineering was fatally flawed. No amount of bending one's perceptions will make it really fit in the light of closer scrutiny, especially several of the points you made with regard to assembly lines and interchangeable people. Guess what: I have another analogy. Well, it's not really an analogy as such. More a historical (but hopefully not too hysterical) comparison. It also has flaws, but (IMO) it lends itself to a clearer comparison with the way I have seen software development (as a process and as a profession) progressing over the years.

      The comparison is with what I'll term woodworking. This has been, and is, known by various other names -- carpentry, cabinet making, furniture making, etc. -- and its practitioners by an equally wide spread of names: the obvious terms that relate to each of the preceding specialisations, but also terms like chippy, shutterer, fencer, joiner, labourer and wood butcher.

      The first of that latter list may be strange to American ears, as I see from an on-line US-based dictionary that the term chippy has a couple of other connotations over there: one is a generic name for a small chirping bird, the other less salubrious. In the UK, a chippy is a slang term for a jobbing carpenter: one who owns his own tools and moves from job to job as the demands of the work take him. Often highly skilled and adept in many forms of his craft, he frequently ends up doing the more menial tasks of his trade in order to secure employment. He usually has to comply with both the subbie's standards and time schedules, and will fall into conflict if he tries to exert too many of his own preferences and ways of working, or expends too much time attempting to comply with his own personal standards.

      In the beginning...

      every able-bodied male was a part-time carpenter. If you wanted a dry place to live, you learnt to cut trees and erect a shelter. If you wanted to eat fish, you learnt how to hollow out a log. If you, your family or your stock animals wanted to be safe from wild animals, including other humans, you learnt to build fences and stockades.

      As time passed, individuals more adept at woodworking than say, fishing or hunting or farming, began making boats and furniture and bows and arrows and all manner of other wooden items and trading the products of their skills for the products of those adept at things that they weren't. Trade was born and specialisation had started. It probably didn't happen quite that way, but it's enough ancient history for my purpose. For now anyway.

      Moving a few million years more up to date...

      we arrive at some time between, say (with apologies to history buffs for my imprecision and inaccuracies), the 13th and 16th centuries, when the guild system gave people titles and developed apprenticeships, extending the father-to-son teaching traditions, further demarcating the skills, evolving the craft and defining craftsmanship.

      The skill of the woodworker (arguably) reached its pinnacle in the mid-to-late 18th century with the likes of Thomas Chippendale. This man was not just an accomplished worker of wood, but also a designer, celebrated then and now, and a published author on both the working of wood and, more importantly, the design of it. However, with fame and success there came a price. To quote from the above link:

      Many fine pieces of furniture have been attributed to Thomas Chippendale, but verifiable pieces are rare. His designs were widely copied, and his 'Gentleman and Cabinet-Maker's Director' was used heavily by other makers in both England and North America.

      Even when a piece can be attributed with certainty to Chippendale's workshop, it is impossible to say for certain that he worked on the furniture himself. As the Chippendale firm became successful, more and more work was carried out by trained workmen rather than Chippendale himself.

      Note the phrase "trained workmen". Presumably they worked wood. So, are they not therefore woodworkers? How about carpenters and joiners? Cabinet makers? Craftsmen in their own right?

      Since WWII, the nature of woodworking has changed ...

      beyond recognition by those that went before. The advent of cheap power (electricity), and tools that use it, means that all but the most highly skilled, and consequently highest-priced, practitioners use fewer hand-powered and more machine-powered tools to do their work.

      Now, if not before, you are probably thinking, "Yes, but carpenters make things and those things are sold, and they cannot easily be replicated, but software can". And you're right, but this isn't about production per se. It's about producing. It's about the performance of the physical and mental tasks that contribute to the process.

      Men like Joseph Hemingway above, ...

      work to another person's specifications, but make them their own. They use their skills and time and hands to convert those specifications into what are more akin to (and carry the price tag of) works of art. A mental, physical and highly skilled art. Craftsmen, sure, but master craftsmen, of which there are very few with those skills, and for whom the market can only sustain a very few, given the price of their produce.

      Then there are your average carpenters.

      Neatly avoiding the fact that the vast majority of furniture produced today is made from synthesised wood (to avoid the problems of variable grain), on large, fully automated machines that are minded by operators who have no reason to know the difference between 'mortise and tenon joints' and 'muscles, tendons and joints'.

      There are still a good number of true carpenters, chippies. They work under contract to large housing firms, set up small companies alone or in groups, or work as employees for larger companies. They work on-site or in workshops. They are the guys who assemble roofs from pre-built truss assemblies, manufactured to generalised specifications by machines in factories. They build kitchens, usually from pre-built carcasses, but sometimes with custom-built doors, work surfaces and finishing. In any case, they still require a considerable amount of skill in order to tailor the pre-built assemblies to the variations that are inevitable in hand-crafted assemblies like houses -- built to specs, but varying one from the other despite the advent of laser levels and machine-made bricks.

      These are still skilled craftsmen.

      Working with their hands, but also with tools to speed the mundane parts of their tasks. Power screwdrivers require a man to load the screws and decide when they are tight enough. Power saws still need the craftsman's eye to know where and how much to cut (besides just keeping the damn things straight :). Nail guns still require a human being to aim them. The skill is in the positioning, the direction and the execution, rather than in the manual effort and dexterity of Joseph above.

      Finally, there are the 'workmen'

      I use this term to include the shutterers, fencers, cladders and floorers. The job they do still requires an element of skill, but the skills are fairly quickly acquired by most physically able people. There is a deftness and performance that comes with experience and practice, but their major contribution is in terms of their time rather than their knowledge.

      Enough of the history lesson and woodwork already

      In the early days of programming, all programmers were akin to Thomas Chippendale and Joseph Hemingway.

      In the late '80s and '90s there were moves made to segregate the analyst from the programmer. To demarcate between the two terms of that old-style job title. The attempt was made to have analysts and architects design and specify everything about a piece of code down to the finest detail, and then have programmers perform the physical tasks of assembly, construction, call it what you will. The aim was to remove the coding from the specification and the design and the testing. The aim was to reduce costs by employing large numbers of 'cheap' coders to perform the mundane, repetitive tasks of typing in the code, naming the variables, writing the comments, compiling the modules and perhaps unit testing them. These then became part of larger assemblies that were also pre-specified, assembled together and tested. And so on to the finished product. Even the style of the code and the variable names were pre-ordained in coding standards and naming conventions. Remove as many possibilities for variability as possible.

      These attempts largely failed for various reasons, but mostly because programmers have brains and want to use them, and because throwing bodies at software development, at least at the salaries demanded in western societies, simply isn't economic. There are other, more fundamental reasons too, but they are beyond my ability to cover adequately in a short (some chance) article.

      I assert ...

      that there is a middle ground that ensures that the productivity of programmers can increase sufficiently to prevent the need to take the jobs offshore. And that this can be done without reducing the job to that of a specialised typist (which didn't work anyway), or moving to an AI-driven, automaton solution. And my assertion is that we, programmers, are the best-placed people in the current world to make this transition happen, by utilising our skills to build better tools to help us in our own jobs.

      By moving away from the 1970s architecture of myriad, highly specialised, individual tools to integrated suites of tools that cooperate with each other at every level. Remove the need for a dozen or more discrete manual tasks, by every programmer, every day, and we can increase our own productivity. By automating a lot of the mundane but necessary housekeeping chores that we must remember but often don't, we can increase our reliability and quality control.

      And I am not talking about point'n'click or drag'n'drop GUI generators or wizard-driven programming tools, nor do I stop with the current notion of integrated development environments (IDEs).

      This should go way beyond ...

      having macro-drivable editors or batch driven build processes or even automated test suites.

      Imagine being able to specify a sort and have the compiler or runtime decide which type of sort to use. Thanks to Abigail we already have two types of sort available to us in P5, but we have to know that and manually select the appropriate one. In my first programming job, they had a FORTRAN-callable system sort utility that would perform a one-pass 'probe' of the data and select a sort appropriate to the data at runtime. Nearly sorted data would use a different algorithm to random data. I believe that the current implementation of the quicksort method (the one you now need to manually select) has a test that will result in the data being randomised before it is sorted, if that data looks as though it might be tending towards the worst-case scenario for the quicksort algorithm. How hard would it be to add a few more? A disk-based insertion or selection sort for large datasets. Others are best referenced by reading Knuth rather than listed off the top of my head. Would it be so difficult to have the 'system' make these decisions for us?
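
      To make the idea concrete, here is a toy sketch of what such a runtime decision might look like in P5. It deliberately side-steps the sort pragma and just swaps between a hand-rolled insertion sort (cheap for nearly sorted data) and the built-in sort; the one-pass 'disorder' probe and the 10% threshold are invented purely for illustration.

          use strict;
          use warnings;

          # One-pass probe: what fraction of adjacent pairs are out of order?
          sub disorder {
              my $data = shift;
              return 0 if @$data < 2;
              my $out_of_order = grep { $data->[ $_ - 1 ] > $data->[$_] } 1 .. $#$data;
              return $out_of_order / $#$data;
          }

          # Insertion sort: cheap when the input is already nearly in order.
          sub insertion_sort {
              my @a = @_;
              for my $i ( 1 .. $#a ) {
                  my $key = $a[$i];
                  my $j   = $i - 1;
                  while ( $j >= 0 && $a[$j] > $key ) {
                      $a[ $j + 1 ] = $a[$j];
                      $j--;
                  }
                  $a[ $j + 1 ] = $key;
              }
              return @a;
          }

          # The 'system' chooses an algorithm for us at runtime.
          sub smart_sort {
              my @data = @_;
              return disorder( \@data ) < 0.10
                  ? insertion_sort(@data)        # nearly sorted already
                  : sort { $a <=> $b } @data;    # let the built-in sort handle it
          }

          print join( ' ', smart_sort( 1 .. 18, 20, 19 ) ), "\n";
          print join( ' ', smart_sort( map { int rand 100 } 1 .. 20 ) ), "\n";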

      Extend that thinking anywhere you dare.

      This site has a fairly strong advocacy for using RDBMSs. And, despite recent threads, I don't disagree entirely with this. So why do we keep our code in source form and re-build it every run (in the case of P5), or re-link it (in the case of other languages)? And why do we keep it in a hierarchical file system and have to search for it anew each time? Wouldn't it make sense to keep our code in the database too?

      In the case of P6, why not place the byte code into a database and grab it from there when we need it? Not everyone has a database, but we distribute a large number of other utilities with the source. Would it be a huge burden to add a database? It needn't be a full-blown RDBMS (or SQL DBMS). A B+-tree DBM like Berkeley would probably do. Think of the advantages for backup, version control, debugging and upgrades.
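
      To give a flavour of that (and no more), here is a toy sketch that keeps code in a Berkeley B-tree file via DB_File rather than in the file system. It stores plain source text rather than byte code, the store_code/load_code names are invented for the illustration, and DB_File needs the Berkeley DB library to be installed.

          use strict;
          use warnings;
          use Fcntl;      # for O_RDWR, O_CREAT
          use DB_File;    # ties a hash to a Berkeley DB file

          # Tie a hash to an on-disk B-tree: keys are unit names, values are source.
          tie my %code_db, 'DB_File', 'code.db', O_RDWR | O_CREAT, 0644, $DB_BTREE
              or die "Cannot tie code.db: $!";

          # Illustrative only: file a chunk of code away under a name...
          sub store_code {
              my ( $name, $source ) = @_;
              $code_db{$name} = $source;
          }

          # ...and later fetch it straight from the DB and compile it, with no
          # hunting through a hierarchical file system.
          sub load_code {
              my $name   = shift;
              my $source = $code_db{$name};
              defined $source or die "No such unit: $name";
              eval $source;
              die $@ if $@;
          }

          store_code( 'Greeting', 'sub hello { print "Hello from the DB\n" }' );
          load_code('Greeting');
          hello();

          untie %code_db;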

      And if you go that far, why keep source code in flat files? Actually, why keep source code at all? P5 can regenerate source code from an op code tree. What's more, it even has (very limited) abilities for configurable formatting of the generated sources (-MO=Deparse,-p). Extend that to include the usual list of code beautifier options -- curly placement, tab size etc. -- and you're on the way to having a searchable, differentiable, backed-up source code control mechanism.
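
      That deparsing ability exists today in B::Deparse (the command-line form being perl -MO=Deparse,-p script.pl). A small example of using it programmatically, round-tripping the op tree of a real sub back into formatted source:

          use strict;
          use warnings;
          use B::Deparse;

          sub max_of_two {
              my ( $x, $y ) = @_;
              return $x > $y ? $x : $y;
          }

          # Regenerate source from the compiled op tree. The options give (limited)
          # control over the formatting: -p adds extra parentheses, -sC cuddles elses.
          my $deparser = B::Deparse->new( '-p', '-sC' );
          print 'sub max_of_two ', $deparser->coderef2text( \&max_of_two ), "\n";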

      You could go a stage further, but I doubt many would like the idea. How about if the compiler/interpreter were an integral part of the editor? Or vice versa. As you type the source code, it is background compiled, line by line, and the byte code placed directly and transparently into the DBM, with a change history record, backup and 'infinite' undo, across editor sessions, re-boots and even programmer boundaries. A useful facility? What if the editor kept track of the names of variables across libraries, application suites, whole systems, and warned you when you made a typo, or duplicated the names for variables of different types, or... Would that be of value? Would it reduce bugs? Improve the lot of the maintenance programmer by helping to ensure consistent naming for the same or similar variable types and parameters between modules, callee and caller?

      Imagine typing a specification for a class in terms of its attributes, and having the accessor and mutator methods (if such things are required) produced for you. As a one-time hit, right there in the editor, rather than as an 'every runtime' cost, as with similar mechanisms available now in P5.
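
      A minimal sketch of the 'one-time hit' half of that, as it can be done in P5 today: the accessors/mutators are generated into the class's symbol table once, when the class is compiled, rather than being resolved on every call. The Point class and its x/y attributes are, of course, invented for the illustration (modules such as Class::Struct do something similar).

          package Point;
          use strict;
          use warnings;

          # Generate a combined accessor/mutator for each declared attribute, once,
          # at compile time.
          BEGIN {
              for my $attr (qw(x y)) {
                  no strict 'refs';
                  *{$attr} = sub {
                      my $self = shift;
                      $self->{$attr} = shift if @_;
                      return $self->{$attr};
                  };
              }
          }

          sub new {
              my ( $class, %args ) = @_;
              return bless {%args}, $class;
          }

          package main;

          my $p = Point->new( x => 3, y => 4 );
          $p->x(10);                              # mutator
          printf "(%d, %d)\n", $p->x, $p->y;      # accessors print (10, 4)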

      Once you have specified your class, the language allows you to specify constraints for the creation parameters. In some languages these are done as runtime assertions, but I use the term constraints because, if the class were turned into a schema as you typed it, and if the runtime creation of instances of that class were automatically and transparently persisted to the DB, then the assertions/constraints specified for the attributes of the class would become foreign key constraints within the table. The attributes of the class are defined in terms of the base DB types, or of other existing classes within the DB.
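
      Purely to make the shape of that idea visible, here is a hypothetical sketch of turning a class-like attribute specification into DDL, with an attribute that references another class becoming a foreign key. The spec format, the gen_table helper and the mapping to SQL are all invented for this illustration; no existing tool is being described.

          use strict;
          use warnings;

          # Hypothetical spec: attribute => [ base DB type, optional referenced class ].
          my %person_spec = (
              id      => ['INTEGER PRIMARY KEY'],
              name    => ['VARCHAR(80) NOT NULL'],
              dept_id => [ 'INTEGER', 'departments' ],   # constraint becomes a foreign key
          );

          # Turn the spec into a CREATE TABLE statement.
          sub gen_table {
              my ( $table, $spec ) = @_;
              my @columns;
              for my $attr ( sort keys %$spec ) {
                  my ( $type, $references ) = @{ $spec->{$attr} };
                  my $column = "$attr $type";
                  $column .= " REFERENCES $references(id)" if $references;
                  push @columns, $column;
              }
              return "CREATE TABLE $table (\n    "
                   . join( ",\n    ", @columns )
                   . "\n);\n";
          }

          print gen_table( 'persons', \%person_spec );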

      Taken to this level, many of the discrete tools we use as a daily part of the programming task start to become redundant as their functions become integrated into and performed by the DBM itself.

      Once this level of design and implementation has been committed to the DBM, it also forms a source for automating, to some degree or other, the process of documentation.

      It also could be used to generate at least some of the required tests, with the DBM itself performing much of the sanity checking, both as the programmer types and later at runtime. The tests themselves begin to sit within the auspices of the DBM, and become self-documenting and journalled as a result. Compiling reports of tests performed and results obtained becomes almost trivial using SQL (or preferably an implementation of D), as the tests themselves and the statistics produced are automatically recorded and collated within the DB.

      I'll stop here.

      The analogy/comparison is again not perfect, and I have taken some liberties with history. Hopefully, the comparison serves its purpose of supporting and justifying, to some greater or lesser extent, the speculation and blue-sky futurism that followed it.

      Has anyone written about this before? If so, I haven't read it, but I would appreciate any pointers from anyone who has.

      Is it all a step too far? And based upon trying to cure a nonexistent ill within our industry, born of an overly excited bout of skepticism?

      Anyone familiar with the history of the British Motorcycle Industry may think not. The parallels aren't that strong, but I think there is something there.


      Examine what is said, not who speaks.
      "Efficiency is intelligent laziness." -David Dunham
      "Think for yourself!" - Abigail
      Hooray!