in reply to Re: Re: Re: Re: Re: A Perl aptitude test
in thread A Perl aptitude test

Wrong rant.

My comments are based on a situation where a small team that included me both developed and maintained an evolving codebase over time. I was one of the maintenance coders (and was the fallback if things got out of hand), and I don't think that lack of ability on the part of the maintainers was an issue.

Rather, my comments have to do with a recognition of the fact that good coding is all about making code clear and understandable. It isn't about showing off how crazy a line of punctuation you can spew to get things done in two fewer lines. If you have someone who has that priority wrong, and thinks that their demonstrations of cleverness are a good thing, they either need to learn better or, if uneducable, be fired.

As for the good environment that you praise, just think about how someone who was cocky and determined to lay claim to stardom would have fit in. Not well, huh? Well, those are the people whom I am pointing out as being problems. By contrast the simple awareness of what will and won't be understood based on the feedback you describe is enough to achieve the goal that I am after. Not a binding of good programmers with sufficient rules to make them by force understandable to relative idiots, but just an awareness of how the team works, and what kind of code will be clear to the rest of the group.


Replies are listed 'Best First'.
Re: Re: Re: Re: Re: Re: Re: A Perl aptitude test
by BrowserUk (Patriarch) on May 07, 2003 at 22:54 UTC

    Given your agreement off-line, this seems as good a place as any to continue our discussion.

    I guess, as someone who makes a habit (and a career:) of what I would term fully understanding the systems, languages and tools I have worked with, and utilising them to their fullest to effect solutions, you touched a nerve with various phrases.

    • ...develops an understanding of obscure constructs by constantly experimenting...
    • ...the tricks that the first show-off used...
    • ...but actually using them all all of the time is a bad thing to do unless you are trying to be hard to understand....
    • ...showing off how crazy a line of punctuation you can spew...
    • ...someone who was cocky and determined to lay claim to stardom...

    I have no idea whether I even appeared on your radar when you made these statements, but even if I didn't, the implication that I would fit into one or more of these disparaged groups has been alluded to from time to time here, and I guess if the cap fits, I tend to wear it.

    What I see as the difference between your stated position (as I interpret it) and my own is summed up for me -- though not necessarily by you, even though they are your words -- in this sentence:

    By contrast the simple awareness of what will and won't be understood based on the feedback you describe is enough to achieve the goal that I am after...

    I read this to mean that if, during peer reviews, certain constructs or techniques used by one or more of the team members were shown to be misinterpreted, or confusing to other team members, then those constructs and techniques would become candidates for the proscribed list.

    The approach taken at the organisation in question was to investigate such occurrences and, provided that the code was without side-effects or technical flaws -- and especially if it had some clear benefits, including but not limited to performance -- the technique then became the subject of an "Advanced Techniques paper", was usually presented to the team when written, and was added to the project, language or tool documentation set as appropriate. The emphasis was on educating the less advanced, rather than limiting the more advanced.

    The reason that there were no 'stars' in the group was simply that the overall standard was so high, that anyone coming into the group, including those who came with a sense of their own stardom, rapidly found that they were, at best, one among many. The effect was salutary. All but a very few that joined whilst I was involved, went through a short period of floundering as they adjusted to the standards in the environment. Then a little wariness as they became accustomed to the peer review process and the way in which help was sought and given so freely, before adjusting to it and revelling in it. The few that did not make the adjustment, usually because they came from backgrounds where you protected your skillset as a way of life, generally moved on of their own volition.

    I now realise that giving the specific example has tended to obscure, rather than clarify, the crux of what I was trying to get at :(.

    The assertion I interpreted you to have made was that using (as I would have it) the full expressive power of perl to code algorithms concisely becomes a liability when the notation used to achieve it falls into that area you variously term "obscure constructs", "tricks", "showing off", "{as} crazy a line of punctuation you can spew".

    I have several problems with that.

    1. Where do you draw the line?

      When I first encountered perl, despite a strong programming background in other languages, I found even the simplest perlisms confusing, frustrating and obscure. Whether it was the usual perl beginner's traps of $a[1] referring to @a not $a, or the confusion over when $_ was implicitly assigned in a while loop and when it was not; the subtleties of the various forms of magic, like <> versus <STDIN>; the autovivification of intermediate terms in complex data structures even when checking for definedness; and a host of other similar things. I rapidly learnt to appreciate the reasoning behind these perlish ideas and learnt to read them and use them to produce leaner, and I would say clearer, code.
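      The autovivification trap mentioned above is worth a sketch, since it bites precisely when you are being careful. This is my illustration, not code from the discussion; merely testing a deep key springs the intermediate levels into existence:

```perl
use strict;
use warnings;

my %config;    # an empty nested structure

# Just *looking* at a deep key dereferences $config{db}, which
# autovivifies it -- the check itself modifies the structure.
if ( defined $config{db}{host} ) { }

my $db_sprang_into_being = exists $config{db};       # true: autovivified
my $host_exists          = exists $config{db}{host}; # still false

# The usual defence: guard each level explicitly, so the deep
# dereference never happens on a missing intermediate key.
my %other;
if ( exists $other{db} && defined $other{db}{host} ) { }
my $other_stayed_empty = !exists $other{db};         # true: untouched
```

The same applies to exists() on a deep key: only the final level is exempt from autovivification.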

      Even as my knowledge of perl improves, I continue to see it used (especially here at PM) in ways that I hadn't thought of, ways that justify the nomenclature of VHLL over and over. And with each new way of expressing the same basic algorithm comes some advantage (at least in some circumstances) over the previous. Sometimes the improvement comes by way of performance; sometimes because it allows the expression of the underlying algorithm in a more concise form that more closely matches the way I think of the algorithm.

    2. All other things being equal, less code means fewer errors.

      Discounting the obfu and golf strategies designed either to deliberately obscure or to reduce the arbitrary metric of keystrokes -- neither of which I pursue to any great extent, although golf can be fun -- I believe that writing concise code is good. Not because it is sometimes more efficient, nor because of the 'laziness' ideal, nor even for the 'once and once only' factor, though that too can be beneficial, not least when it comes to maintenance -- we've all fallen foul of modifying a piece of code in one place whilst failing to modify a similar piece elsewhere.

      I believe concise code that achieves the required algorithm is good, because every metrics analysis I have ever seen indicates that the more code there is, the more errors there are. I have never seen anything that indicated otherwise.

    3. Less is more when it comes to understandability and maintenance.

      A not entirely secondary reason is that my own experience suggests that it is always easier to get a feel for the overall algorithm if it can be viewed in one go than if you have to scroll up and down to see the different parts. This goes right back to my first ever commercial programming course, for FORTRAN77 at DEC, where the instructor suggested that any routine over two or three (24-line) screenfuls should be broken into two subroutines. At the time, coming from an assembler and BASIC background, that was anathema to me. Down the years, I've come to appreciate this advice more and more.

      Now, in Fortran the costs of splitting a routine into a couple of subroutines were minimal; in perl they are considerably higher, but the overhead is more than offset by the expressive power of perl to do a lot with a few well-written lines.

      The way I've come to think of it -- and this may or may not appeal to you, the mathematician -- is that any programming language is simply a notation for expressing the algorithm in question. In the same way that advanced maths usually relies upon a shorthand notation to express the ideas and steps of a theorem, so programming relies on the expressions of the language used.

      It would be considerably easier for me to follow mathematical ideas if all the formulae and proofs were written in terms of +-/*^ sin/cos/tan etc., with all the intermediate steps spelt out explicitly. Whilst many, if not all, of them could be written this way, they would become so tortuously long and repetitive that mathematicians have evolved notations that encapsulate many of the intermediate terms and steps very concisely. If you understand the notation, then the reduction in clutter makes the overall equations and proofs much easier to comprehend -- or so I'm told; my maths stops way short of this:).

      This latter point is made better and more authoritatively by Stephen Wolfram here, when he notes:

      In early mathematical notation--say the 1600s--quite a few ordinary words were mixed in with symbols. But increasingly in fields like mathematics and physics, no words have been included in notation and variables have been named with just one or perhaps two letters. In some areas of engineering and social science, where the use of mathematics is fairly recent and typically not too abstract, ordinary words are much more common as names of variables. This follows modern conventions in programming. And it works quite well when formulas are very simple. But if they get complicated it typically throws off the visual balance of the formulas, and makes their overall structure hard to see.

    Whilst there is undoubtedly a certain "show off" factor in devising more concise forms for expressing various algorithms in perl, provided that the aim is reduction of clutter, in the form of removing redundant intermediate terms and steps; that efficiency isn't sacrificed; and deliberate obfuscation isn't the goal; then the result of the friendly competition is often a new idiom. Whilst it may look obscure the first time it is encountered, as with so many now-commonplace idioms, it rapidly becomes familiar through seeing it and using it.

    In my 10 months of perl and visiting PM, I have seen several new idioms come into use, and learnt many more through reading the work of merlyn, Abigail, Aristotle, Juerd, sauoq and many others. I've also witnessed the process whereby they are used (and sometimes evolved) in the course of one thread, and then start to crop up in the questions and answers of other threads. This does not usually take very long to happen when the idiom is useful. The truly obscure ones simply disappear.

    I appreciate that the audience at PM is, by definition, an exceptional environment populated for the most part by enthusiasts, but with rare exception, every 'good' programmer I have encountered over the last 30 years has not just programmed for a living, but has been an enthusiast.


    Examine what is said, not who speaks.
    "Efficiency is intelligent laziness." -David Dunham
    "When I'm working on a problem, I never think about beauty. I think only how to solve the problem. But when I have finished, if the solution is not beautiful, I know it is wrong." -Richard Buckminster Fuller
      My apologies for taking so long to get back to you. I warned you that that would happen though, and that you should expect the same through more iterations of this conversation.

      First of all you suggest that your response is partly because you feel that I am criticizing a kind of person that you feel you are. This always complicates things. Please don't take my comments as personal criticism, but rather as explaining an important trade-off which is going to be made somehow, both of whose sides have advantages and costs. To me being aware of this kind of issue allows me to learn to use the most complex system around me - my co-workers - to the greatest extent possible. The technical tools are less than half the battle.

      The trade-off in question is between the advantages of being able to use subtle tricks for the programmer's convenience, versus the costs of maintaining the code which uses those tricks. As usually happens in real life, this is not a trade-off which has a clearly right answer about where to draw it. You ask where to "draw the line", and my answer is that it is situation dependent. In practice what I tend to do is think hard about the problem, decide what "bright lines" I think I can draw which are appropriate for the situation, then draw them and not cross them without specific reason. And by "bright line" I mean a line which is very easy to recognize, and hopefully is not too far off from what I think is optimal in the situation. (BTW I got the phrase "bright line" from a couple of lawyers...) But in different situations I draw different lines.

      OK, so what are the signs that I use to say when someone is overusing tricks? Well I think it is fair to say that if a programmer has difficulty reading the code that they wrote 6 weeks ago because they don't remember the tricks used, then it is obvious that they are pushing language features more than they are ready for. Ditto if the programmer went out of his way just to use a cool trick. I also lean towards saying that anyone who can't clearly explain the subtle possible gotchas in a technique shouldn't be blithely using it. (I tend to be the person who sorts out what happens when they get bitten by the gotcha...and those just aren't fun bugs to track down and explain.) And if you have other people who have to back them up on the code constantly wondering how something could possibly work, then you have a problem of some sort (possibly more education needed on one side, possibly less trickery on the other *).

      Going the other way, if a programmer is finding themselves constantly hampered because they cannot organize code in useful ways, then that is a sign that you want to enable more features. I am less sympathetic to tricks that save a line or two here or there. I am much less sympathetic still to tricks that only enable marginal optimizations. (Premature optimization is the root of all evil...) OTOH I am very sympathetic to someone who wants to be able to write modules, use OO design, or throw around hashes of anonymous functions. I become even more so if the feature is one that will be used constantly, meaning that the effort of teaching other people can be amortized over more code.

      You can see one decision that I have made in real life based on this at Re: Re: Re: Hash sorting. My attitude is that the amount that I need to teach to be happy with

      while (my ($k, $v) = each %foo) { ... }
      is not really worth it when for complex code I can just do:
      foreach my $k (keys %foo) { my $v = $foo{$k}; ... }
      and not have to worry about someone forgetting the subtle gotchas with the more efficient idiom. However I have found it worthwhile to teach the mindset from which it is perfectly understandable to write:
      print join ",", map { &{$field_info{$_}{format}}($deal) } @fields;
      print "\n";
      though I might actually write that as...
      my @field_subs = map { $field_info{$_}{format} } @fields;
      # time passes...
      print join ",", map $_->($deal), @field_subs;
      print "\n";
      The reason that I treat the two idioms differently? The latter enables me to organize the overall program differently - big win - and once the mindset is understood it is easy to remember why it works. The former saves a line or two and avoids an unnecessary copy - small win - and having learned the trick, if you go away from Perl for a while it is easy to forget important facts about it.

      Plus the latter idea can be taken and used in languages like JavaScript or Python. The context-specific coding trick cannot. More bang for the buck.
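      A self-contained sketch of the dispatch-table idiom above may help; note that the contents of %field_info here are invented for illustration, not the real field definitions from the code under discussion:

```perl
use strict;
use warnings;

# Hypothetical field definitions: each field carries a 'format'
# coderef that knows how to render one attribute of a deal.
my %field_info = (
    symbol => { format => sub { uc $_[0]{symbol} } },
    qty    => { format => sub { sprintf '%d',   $_[0]{qty}   } },
    price  => { format => sub { sprintf '%.2f', $_[0]{price} } },
);
my @fields = qw( symbol qty price );
my $deal   = { symbol => 'ibm', qty => 100, price => 81.5 };

# Hoist the coderefs out once; they can then be applied to any
# number of deals, or passed around to other parts of the program.
my @field_subs = map { $field_info{$_}{format} } @fields;
my $line = join ",", map { $_->($deal) } @field_subs;
# $line is now "IBM,100,81.50"
```

The point is the organizational freedom: the loop that formats deals no longer needs to know anything about %field_info at all.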

      Follow-up nodes with more on this issue that you might want to look at: Why I like functional programming (explains the map trick above), Re (tilly) 1: Teaching Perl Idioms (says what subset I last used), Re (tilly) 1: to perl or not to perl (suggested subset for writing Perl for Perl beginners), RE: RE: Shot myself in the foot with a pos (me having this argument with chip) and RE (2): Filehandle Filter (I am not choosing out of not having learned more than I use). The trade-off that I am talking about between how much you have to keep in mind while programming and the utility of keeping it in mind reminds me of my thoughts about global variables (in obvious and disguised forms) at Pass by reference vs globals. More general context on my attitudes towards learning can be found at The path to mastery and Re (tilly) 1: Discipline.

      * Note that I don't always say that the first programmer needs to use fewer tricks; that is as wrong as saying that every programmer should learn every possible technique before they do anything. Hopefully this answers your question about whether causing confusion in others makes something a candidate for the "proscribed list". Well, you should think about it. But it is just as likely that you will instead decide that "people around here should understand that", and think about what you can do to make sure that people are at the desired competence level.

        My apologies for taking so long to get back to you.

        No problem. I never saw a debate worthy of the name that wasn't improved by the time to think. I would have had considerably less strife in my life had I had an (erasable) output buffer on my mouth :).

        First of all you suggest that your response is partly because you feel that I am criticizing a kind of person that you feel you are.

        Think no more of it. If I felt 'slighted' in any way by your words, it was only because I chose to. If your words incite sufficient passion in me to respond, it is just an indication that I have some level of interest in the subject matter. Perhaps the greatest of life's lessons, and one of the hardest to grasp, is that the opposite of love is not hatred, but disinterest. I promise not to be offended by anything you say on the basis that you'll do the same :).

        That doesn't mean I won't argue with what you say, only that I won't take it as directed at me personally, but rather at whatever "group hat" my screen persona is choosing to wear today :). That I will often use "I" rather than "they" or "we" is simply indicative that I am rarely comfortable speaking on behalf of others, and trying to write without using pronouns results in turgid, insipid prose.

        * Note that I don't always say that the first programmer needs to use fewer tricks; that is as wrong as saying that every programmer should learn every possible technique before they do anything. Hopefully this answers your question about whether causing confusion in others makes something a candidate for the "proscribed list". Well, you should think about it. But it is just as likely that you will instead decide that "people around here should understand that", and think about what you can do to make sure that people are at the desired competence level.

        Rather than respond to the rest of your post blow-by-blow, I'll use the above as a placeholder.

        Had you used the word 'techniques' everywhere that you used the word 'tricks', in this or your original post, I doubt that we would be having this discussion, but where would be the fun in that:). The line I was looking to see you draw is what distinguishes a 'technique' or 'idiom' from a 'trick'.

        I've promised myself that I would try to avoid using analogy on-line, as it usually leads to more confusion than clarification, but since you've mentioned lawyers:). The phrase currently resounding in my head is 'in loco parentis'.

        In the specific cases you cite, whilst there are undoubtedly gotchas involved in using the while/each form over the foreach/keys form, the reverse is also true.

        I have two problems with not using the while form of the loop.

        1. foreach does not scale well once the number of keys grows large, since keys builds the entire list of keys in memory before the loop begins, where each retrieves one key/value pair at a time.
        2. A much more insidious problem is that what you are doing by using the foreach method to avoid the 'one iterator' problem inherent in the while form is disguising the real problem.
        There was a recent node (that I can't find right now, but may do before posting) where the OP was confused by the weirdness he was experiencing when using splice on an array whilst iterating over it within a foreach loop. To my way of thinking, the problem here lay with lack of awareness of the nature of the foreach mechanism (ie. aliasing, list expansion).
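        The splice-within-foreach weirdness just described is easy to reproduce. This sketch is my own reconstruction of that kind of bug, not the OP's actual code; the behaviour of modifying an array mid-foreach is officially undefined, though mainstream perls behave as shown:

```perl
use strict;
use warnings;

my @queue = (1, 2, 3, 4, 5);
my @seen;

# foreach iterates by position over the array as it stands on each
# step; splicing out an element shifts the remaining elements
# underneath the iterator, so one of them is silently skipped.
for my $item (@queue) {
    push @seen, $item;
    splice @queue, 0, 1 if $item == 2;   # remove the first element
}

# @seen is (1, 2, 4, 5): after the splice the 3 slid into the slot
# the iterator had already passed, so it was never visited.
```

The safe alternatives are to iterate over a copy, or to build a new array rather than mutating the one being walked.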

        So, by using foreach instead of while, you are protecting the OP from the gotchas of the 'one iterator', but leaving them vulnerable to the non-scalability of foreach as their project grows. The former will usually fail quite quickly, once a change is made in the body of the loop that alters or resets the iterator of that loop. The latter will tend to 'creep in' as the scale grows over time: code that has been working correctly for a long time, and that hasn't been modified, starts first slowing down and then failing intermittently, because the size of the list in the foreach loop has grown to the point where it starts bumping its head.
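        The 'one iterator' failure mode mentioned above can be sketched as follows (my own minimal example): each hash has exactly one internal iterator, shared by each(), keys() and values(), so an innocent-looking keys() call inside the loop body resets it and the loop restarts from the beginning:

```perl
use strict;
use warnings;

my %h = map { $_ => 1 } 'a' .. 'e';

# Calling keys() -- even in scalar context, just to count -- resets
# the hash's single iterator, so each() never reaches the end.
# Guarded here with a counter to avoid a genuine infinite loop.
my $steps = 0;
while ( my ($k, $v) = each %h ) {
    my $count = keys %h;     # innocent-looking: resets the iterator!
    last if ++$steps > 20;   # without this guard, we would never exit
}

# $steps ends up well past the 5 keys in %h: the loop was restarting.
```

This is exactly the class of bug that makes the foreach/keys form attractive to teach, at the cost of the scalability trade-off above.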

        You are also avoiding the need to mention that any form of manipulation of the source of a list, whilst iterating over that list (using either mechanism), has to be done with serious consideration to the side-effects such manipulation might have. This latter caveat is one that is inherent in every language and is a fundamental tenet of programming.

        Don't mess with contents of a data structure whilst in the process of iterating it, without giving serious consideration to the effects of doing so.

        The question I have here is, are you doing them any favours? Isn't it better to mention the limitations of the code up front, perhaps mentioning the alternative and its inherent limitations, and so cause them to think about both solutions and their associated caveats? That way, when their code starts to fail as the scale grows, or the code in the loop is modified beyond its original use, they may remember, and have a notion as to what to look for.

        In the end it translates to there being no substitute for being aware of the ins and outs of each technique (trick) and applying that knowledge in the context of the problem at hand. No amount of rules-of-thumb, selective deprecation, or "thou shalts" and "shalt nots" will ever substitute for understanding of the underlying mechanisms. There is also a lot to be said for learning from your own mistakes. There are industries and professions where this is an absolute no-no -- airline pilots, doctors, judges etc. -- and in these cases there are mechanisms for preventing catastrophic errors by these types of people. That's why it takes 7 years to qualify as a doctor and 15 or more as a judge (a guess, and a legal-system variant).

        There are skills that can be rote-learnt, or picked up from a "... for Dummies" or "Teach yourself ... in 30/21/14/7/3 days/hours/minutes" book:), but coding does not fall into this category. If someone learning to code is only doing so to "get the job done", then they would be better advised to employ the skills of someone with the requisite experience. However, if they have an immediate need to solve a specific problem, but are ready to take on board learning for the longer term, then I think giving them an immediate solution to that problem whilst (briefly) mentioning any inherent caveats is ok.

        Likewise, if the problem is that a specific snippet of their code is 'taking too long', or 'consuming too much memory', and there is an advanced, even obscure, technique (trick) that will (even if temporarily) alleviate that specific problem, then I have no problem with suggesting that technique whilst mentioning that the long-term solution to their problem may be a better algorithm, or even bigger hardware.

        With respect to the links you gave, I had already read most if not all of that, and most of your previous posts too, though not necessarily in context, and it is all interesting. Personally, I'm hoping that with your return, some of the open debate that took place back then will also return; it has been sadly missing around these parts recently.

        Relating this all back to where this thread started: my reasoning for suggesting the question "When would you consider not using strict?" rather than the original "What are the advantages of use strict?" is that I believe the former would reveal more about the responder than the latter.

        I would be as wary of the responder who said they would never consider not doing so as I would of the responder who said they never did use it. It doesn't take much effort to discover that use strict is generally thought of as a "Good thing!". If the person saying that they never used it does so through ignorance -- either because they hadn't done enough actual coding, or at least enough research, to know that saying this is quite likely to raise serious questions -- then the chances are that I am going to "discover" this ignorance through one or more of their other answers.

        If, however, they do so with a full knowledge of the implications, then I am quite likely to take them seriously as a candidate for the post. Their action in being bold enough to state this position, in spite of knowing how it might be received, indicates (to me at least) someone who bases their judgements upon their own thought processes and decisions rather than upon the collective consciousness. This is a person that I would want to work with. Here I go with one of those "in my experience" statements, but in this case it happens to be so: this type of person can often contribute more to a project than those that consistently toe the party line. (Mr. Segars, where are you now?)
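        For what it's worth, one classic, considered answer to "when would you not use strict?" is generating accessors through symbolic references, where the strictures are relaxed deliberately, locally, and for exactly one category. A hypothetical sketch (the Point class and its fields are invented for illustration):

```perl
use strict;
use warnings;

package Point;

# Generate trivial read accessors for a known list of fields.
# Symbolic references are illegal under strict, so we disable
# only 'refs', and only within the smallest possible scope.
for my $field (qw( x y )) {
    no strict 'refs';
    *{"Point::$field"} = sub { $_[0]{$field} };
}

sub new {
    my ($class, %args) = @_;
    return bless {%args}, $class;
}

package main;

my $p   = Point->new( x => 3, y => 4 );
my $sum = $p->x + $p->y;    # the generated accessors: 3 + 4
```

The candidate who can articulate why the `no strict 'refs'` is scoped to that one block, rather than the whole file, is demonstrating exactly the "full knowledge of the implications" described above.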

        All of that said, I don't think that in most specific cases your position and mine are so very far apart; the difference is in how we would go about handling them. I find myself having grave doubts about making this final statement, as I would hate for it to be widened into a general debate, which would become totally unrewarding in this environment, and I've already ranged over a far wider area than the inputs merit, but as a generic statement:

        I don't believe prohibition to be the solution to any problem.

        There will always be idiots like me who never got over their teenage angst against "Don't do as I do, do as I say" and will want to understand "Why not!"*

        * Note: That isn't a question:)

