in reply to Re^4: Steve Yegge on how to build IDEs and improve speed of dynamic languages
in thread Steve Yegge on how to build IDEs and improve speed of dynamic languages

It's perhaps the weirdest thing about programmers. They'll spend countless hours persuading their grandmothers to keep their knitting patterns and recipes and photos on a computer. And countless more automating drawing the curtains, making coffee, and anything they can. But when it comes to using computers to assist the process of programming, it's like you'd asked them to dance naked in a fire-pit.

I don't trust most humans to assist the process of programming. The way I see it, a computer is useful for doing repetitive tasks. A good program, following DRY, isn't repetitive. So, where does the computer fit in?

DRY works because of modules such as Moose, Catalyst, and DBIx::Class. I'd prefer to have the computer work through the medium of a module rather than the medium of an IDE. I don't have to think about the module, whereas I do have to think about the IDE.
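
As a rough sketch of what I mean (Point and its attributes are just a made-up example), a module like Moose has the computer generate the constructor, accessors, and type checks I'd otherwise be repeating by hand in every class -- and I never have to think about how it does it:

    package Point;
    use Moose;

    # Moose writes the repetitive code for me: a constructor,
    # read/write accessors, and a type check for each attribute.
    has 'x' => ( is => 'rw', isa => 'Num', default => 0 );
    has 'y' => ( is => 'rw', isa => 'Num', default => 0 );

    __PACKAGE__->meta->make_immutable;

    package main;
    my $p = Point->new( x => 3, y => 4 );
    print $p->x, "\n";    # prints 3 -- an accessor I never wrote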

And, personally, I enjoy dancing naked around a firepit. Dancing in one is a slightly more burning matter. :-)


My criteria for good software:
  1. Does it work?
  2. Can someone else come in, make a change, and be reasonably certain no bugs were introduced?

Re^6: Steve Yegge on how to build IDEs and improve speed of dynamic languages
by GrandFather (Saint) on May 14, 2008 at 01:30 UTC

    Computers are good at repetitive, tedious, fiddly and complicated tasks, and they are good at remembering lots of stuff. As BrowserUK implies, a good IDE leverages those characteristics to pick up errors early and to make it much easier and faster to find the information that is required during program development.

    There are a few astounding individuals who seem to be able to hold huge amounts of information about how a piece of code hangs together in their heads and access the pertinent information they require with apparent ease. Most of us can't do that. Having a bundle of well-coordinated tools that compensate for that lack allows the rest of us to participate in the game and generate quality output, where otherwise we would get so bogged down in detail and tedium that we would never produce anything.


    Perl is environmentally friendly - it saves trees
Re^6: Steve Yegge on how to build IDEs and improve speed of dynamic languages
by BrowserUk (Patriarch) on May 14, 2008 at 00:04 UTC
    A good program, following DRY, isn't repetitive. So, where does the computer fit in?

    Well, checking that I typed the casing of the module name correctly as I type it could be useful.

    And remind me, is that Text::PDF::Array method name elementsOf or elements_of?

    How about generating those 10 or so almost identical lines that appear at the top of many of the test cases you have to write?

        ##
        # DBM::Deep Test
        ##
        use strict;
        use Test::More tests => 128;
        use Test::Exception;
        use t::common qw( new_fh );

        use_ok( 'DBM::Deep' );

    Shall I go on looking for the opportunities? :)


    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.
      Catching typos and integrating documentation ... now those are useful things, so long as they are requests and not requirements. I personally despise the hard-integration that some IDEs have that says "I will offer my suggestions NO MATTER WHAT YOU WANT cause I think you're a bloody idiot." At that point, give me Notepad or you'll have a piece of paper that says "I quit!" shoved up your nose.

      Now, Ovid and I have each separately created vim macros for working with test suites and doing syntax checks on a file. Sure, I can see the benefit of those things. I even have skeletons for when I create a new file that generates those lines you're talking about and it's different for a .t vs. a .pm/.pl. Though, frankly, I should be using something like Test::Class or some other xUnit framework so that my setup and teardown can be DRY'ed.
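
      If I ever get around to it, it would look something like this rough sketch (the package and fixture here are just illustrative), with the setup and teardown written once rather than pasted into every .t file:

          package My::DB::Test;
          use strict;
          use warnings;
          use base 'Test::Class';
          use Test::More;

          # Runs before every test method: one place for fixture creation.
          sub setup : Test(setup) {
              my $self = shift;
              $self->{db} = {};    # stand-in for a real fixture
          }

          # Runs after every test method: one place for cleanup.
          sub teardown : Test(teardown) {
              my $self = shift;
              delete $self->{db};
          }

          sub stores_values : Test(1) {
              my $self = shift;
              $self->{db}{answer} = 42;
              is( $self->{db}{answer}, 42, 'value round-trips' );
          }

          __PACKAGE__->runtests unless caller;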

      But, what does it tell you that you need a program to manage your project?


      My criteria for good software:
      1. Does it work?
      2. Can someone else come in, make a change, and be reasonably certain no bugs were introduced?

        Oh, I completely agree with you about non-optional mandates. Take that stupid f'ing hairgrip: I suffered it exactly three times before I hacked my corporate client's locked-down workstation so that I could use WordPad to produce specs and stuff compatible with their expectations, but never have to see that triumph of marketing over common sense ever again.

        I don't use an IDE, and haven't done so regularly since PWB nearly twenty years ago. But that's only because these days they tend to produce makefile equivalents (in XML, for dog's sake!) that don't work with command-line tools, couch everything in stupid terminology, and otherwise impose stuff rather than offer it.

        But, what does it tell you that you need a program to manage your project?

        I read the entire OP-linked article and don't recall seeing any reference to needing an IDE anywhere. Maybe I missed it? But a question: could you work without gVim? And if you can, would you want to?

        I think there is a big difference between preferring to use a toolset that you are familiar with and find benefit from, and being unable to do anything without it.

        That said, I still make a habit of washing my car by hand, which draws--shall we say inquisitive :)--looks from my immediate neighbours, who both use power washers. I met with a similar thing in the supermarket when I disputed the bill and declined the offer of a calculator to check the items. And I can quite see maps, and the skills required to read them, becoming a niche-market thing 10 or so years from now. With satnav reducing to commodity prices, who needs them?

        Did you ever watch Norm Abram, Master Carpenter, on the TV program "New Yankee Workshop"? My carpenter father would have cringed to watch him making mortise & tenon joints using a router and jigs, but there is no doubt that he turns out workpieces much faster than my old man could. Could he do it by hand? Probably. The question is, why would he? And if the next generation of Master Carpenters can't, will their work be any the lesser for it?

        I remember a neighbour I worked with back in my apprentice days in a car factory, bitching that the skills required to 'wipe' visible body-panel joints with molten lead were being replaced by the then-'modern' epoxy products. He felt that using 'plastics' was a cheapening of the body finisher's art. When he died 20 years ago from renal cancer, it wasn't proven that his many years' exposure to lead fumes was responsible, but it probably didn't help.

        It's inevitable that, as new tools and processes become available to any set of skilled workers, some old traditions and skills will decline, and even be lost forever. The question is, does that inevitably mean an inferior product?

        Like many of my generation, I started out coding in assembler, and still do occasionally, through choice. I don't know how your assembler skills are, but (if they aren't your best skills) I doubt you'd be best pleased if I accused you of being less of a programmer because you can't hand-optimise a routine to do overlapping memmove operations. (Or whatever.)

        The simple fact is that modern compiler technology can usually do a far better job of optimising code that stretches to more than a few dozen lines, simply because computers don't forget things the way humans do. They can easily juggle, and remember to take account of, tens, hundreds or even thousands of concurrent parameters, whereas the average human can manage 5 or maybe 7. They don't get bored, or get distracted by the football scores, or the need for coffee or the loo. They are infinitely patient and meticulous to a fault.

        Now imagine all the ways those strengths could be employed in the work you do.

        Do you ever use one of those SQL GUIs? I do. Bloody marvelous things. Catches 100% of my typos by offering keyword and variable completion. (You use tab completion in your shell, don't you?) Reminds me that I have to add a GROUP BY clause if I've used an aggregate function. Almost makes using SQL bearable.

        The Perl world is moving toward the use of declarative syntax for class definition (P6, Moose, et al.). Now what if your DE reminded you, when you coded a call to a method, that it was still virtual and hadn't been implemented anywhere yet? Or that you'd just assigned a non-numeric constant value to a variable that later gets passed to a subroutine that will attempt to use it as a number? Sure, you'll find that out later when you compile or run the code, assuming that you exercise that particular path. But wouldn't it be nice to be notified as you did it on Friday evening, rather than finding out on Tuesday morning, after the long weekend, when you try to integrate it with a bunch of other people's code?
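
        You can see the raw material for that sort of checking in Moose already (this little Counter class is just an illustration, not anything real): the attribute declaration carries type information that a smarter DE could check statically, instead of leaving it to blow up at runtime:

            package Counter;
            use Moose;

            # The declaration itself carries the type a tool could read.
            has 'count' => ( is => 'rw', isa => 'Int', default => 0 );

            package main;
            my $c = Counter->new;
            $c->count( 10 );       # fine
            $c->count( 'lots' );   # dies at runtime with a type-constraint
                                   # error -- the kind of thing a DE could
                                   # flag as you typed it on Friday evening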

        In essence, imagine that all the things your compiler & linker, or interpreter, discover and record when they are run as a command-line (batch) job happened right there inside your editor as you type. Instead of you typing, saving a source file to disk, invoking the compiler that converts your code into an AST, discovers incompatibilities, converts those into file/line-number/error/context descriptions and outputs them to another file, which you then load into your editor and use to (manually or automatically) move to the appropriate point in your code in order to work out a corrective action -- it all happens as you type, right there in the editor. By the time you've finished typing your code, the editor can write out an object or bytecode file directly.
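
        You can fake a crude version of that today with nothing more than the command line (the editor hook is imaginary; the rest is just perl -c): run the compile check on every save and turn the diagnostics into file/line/message tuples the editor can jump to:

            #!/usr/bin/perl
            # Crude stand-in for an in-editor compile check: wrap perl -c and
            # collect file/line/message tuples an editor could jump to.
            use strict;
            use warnings;

            die "usage: $0 file\n" unless @ARGV;

            sub check_source {
                my ($path) = @_;
                my $diagnostics = qx{perl -c "$path" 2>&1};    # perl -c reports on STDERR
                my @problems;
                for my $line ( split /\n/, $diagnostics ) {
                    # Typical form: 'syntax error at foo.pl line 7, near "}"'
                    if ( $line =~ /^(.+?) at (\S+) line (\d+)/ ) {
                        push @problems, { message => $1, file => $2, line => $3 };
                    }
                }
                return @problems;
            }

            for my $p ( check_source( $ARGV[0] ) ) {
                printf "%s:%d: %s\n", $p->{file}, $p->{line}, $p->{message};
            }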

        Then take that one step further. Instead of writing your program to disk as either a source file or an object file, the editor saves the AST directly into your source-control database. Instead of diffs being done at the source-code level, they are done directly on the AST itself. Rather than pretty dumb matching that flags whitespace, bracket placement and other source-level variations as significant differences, it understands the structure of the code and far more accurately reflects only the semantic changes to the code.

        And when your colleague comes along to work on your code, his editor will expand the AST in a way that reflects his preferences for tab settings or brace placement. Coding standards, or at least the code-layout elements of them, become redundant, because each person views the code in whatever way makes most sense to them. The code need never exist as conventional source code at all, unless you choose to print it out, or perhaps move it into another DE.

        And once your toolset understands the semantics of the code, and doesn't have to bother re-parsing it every time it is accessed, think of the possibilities for cross-module dependency checking and cross-referencing; automated documentation generation; automated test-case generation; automated test-data generation.
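
        You can get a small taste of that today with PPI, which parses Perl into a document tree without running it (the file name below is just an example). The same kind of structural query is what automated cross-referencing and documentation generation would be built on:

            #!/usr/bin/perl
            # Structure-aware tooling in miniature: parse a Perl file into
            # PPI's document tree and list its dependencies and declared subs.
            use strict;
            use warnings;
            use PPI;

            my $file = shift || 'MyModule.pm';    # example path
            my $doc  = PPI::Document->new($file)
                or die "Could not parse $file: " . PPI::Document->errstr;

            # Every use/require statement in the file.
            my $includes = $doc->find('PPI::Statement::Include') || [];
            print "Depends on:   ", $_->module, "\n" for grep { $_->module } @$includes;

            # Every named subroutine declared in the file.
            my $subs = $doc->find('PPI::Statement::Sub') || [];
            print "Declares sub: ", $_->name, "\n" for grep { $_->name } @$subs;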

        The limits are only what you can imagine -- but only once we programmers collectively open our minds to the possibilities, and abandon our need to retain absolute control of our source code by keeping it in that primitive, unstructured data format called 'source files'.


        Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
        "Science is about questioning the status quo. Questioning authority".
        In the absence of evidence, opinion is indistinguishable from prejudice.