It seems like most Perl scripts, along with most utilities that come with Unix-like operating systems, rely on command line switches to control them. If they don't recognize one of the flags you used, they just tell you to run them again with -h or --help.

Maybe it's because I mainly use Windows, but whenever I write a small command line script, even if it's just for my own use, I always have it prompt for input if none is specified. For example, these two commands would accomplish the same thing:
    > perl script.pl -f filename.txt

    > perl script.pl
    Which file would you like to use? filename.txt
I can only recall having seen one program that did this, which was a Perl script that walked you through setting up users for MySQL.

Is there a good reason for not doing it my way? It seems like that would make the program much more accessible to users who can't remember all those flags. Is it just a case of these programs traditionally being used by advanced Unix users or called by other programs?

Replies are listed 'Best First'.
Re: A thought about usability in Perl and Linux in general.
by holli (Abbot) on May 18, 2005 at 09:22 UTC
    I don't like scripts that require user input. It's hard, sometimes impossible, to run them in a batch job.
    In your case, where user input is optional, it's ok for me. But that doesn't mean I would ever use it ;-)


    holli, /regexed monk/
Re: A thought about usability in Perl and Linux in general.
by derby (Abbot) on May 18, 2005 at 12:38 UTC
    It's more of a philosophical difference (google "the Unix way"). Toolbox, Filters and Piping are the key concepts at work here - the ability to chain together n number of programs to do what you want.

    So while a prompt works great for your environment, it rather sucks in a filter/pipe environment - the chaining concept breaks down fast if a program within the chain blocks to ask for some input that could easily have been supplied on the command line.
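    To make the chaining concern concrete, here is a minimal sketch of a pipe-friendly Perl filter (the script name and the uppercasing behaviour are made up for illustration): it reads the files named in @ARGV, or STDIN when none are given, and writes to STDOUT, so it can sit anywhere in a pipeline without ever blocking on a prompt.

```perl
#!/usr/bin/perl
# upcase.pl (hypothetical name): a filter in the Unix mold.
# The diamond operator <> reads each file named in @ARGV in turn,
# or STDIN when @ARGV is empty -- so both of these work:
#   perl upcase.pl notes.txt
#   cat notes.txt | perl upcase.pl | sort
use strict;
use warnings;

while ( my $line = <> ) {
    print uc $line;    # transformed data goes to STDOUT, nothing else
}
```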

    -derby
        Toolbox, Filters and Piping are the key concepts at work here - the ability to chain together n number of programs to do what you want.

      That is exactly what I was thinking. Take the classic Unix command line:

        grep -i '^fred' $( ls f*.txt | fgrep -v 42 ) | sed -e 's/Baltimore/Glen Burnie/' | troff -mm -rN5 | lpr &
      I count six different commands in that statement, and four take no arguments. The Unix philosophy of stringing together smaller utilities into a bigger tool encourages that sort of thing. DOS, which is still the basis of the Windows approach, has/had no such infrastructure to encourage this behavior. In fact, it probably discouraged it.

      It's odd, but I see this affecting the approach to software that we take in my group at work vs another group that is all VB/Java. They mock and fear the command line, while those of us in the Unix world tend to think "Yeah, yeah, just because you don't understand it."

      --
      tbone1, YAPS (Yet Another Perl Schlub)
      And remember, if he succeeds, so what.
      - Chick McGee

Re: A thought about usability in Perl and Linux in general.
by ghenry (Vicar) on May 18, 2005 at 09:39 UTC

    I think it's a good option for installers, but really, just depends on what the program is supposed to be doing. If it can do only one thing, then it's best to hard code these things or have them in a config file. A standard Linux program should have something in /etc also.

    If a user is going to the command line, then they usually know what they are doing; otherwise they would be using the GUI tools for installing things. If they forgot the switches, anyone with half a brain would read the man page (if it has one).

    It also creates a lot more work for the programmer and we're all lazy.

    Gavin.

    Walking the road to enlightenment... I found a penguin and a camel on the way.....
    Fancy a yourname@perl.me.uk? Just ask!!!
Re: A thought about usability in Perl and Linux in general.
by brian_d_foy (Abbot) on May 18, 2005 at 17:30 UTC

    I'll produce error messages for missing but required input, but that's it. I want my programs to be usable however the user needs them, including in shell scripts and cron jobs. I can't rely on any particular use.

    Programs that insist on prompting for things are much more work to automate. You end up with entirely new languages like Expect (or its Perl equivalent) where you have to fool the computer into thinking you are a human, and you have to know the prompt strings and other interface bits. I typically want to spend as little time as possible with the computer, so I typically don't design programs that make me actually be there.

    That being said, some programs might have some sort of flag to turn on "interactive mode" or some such. I have no problem with that sort of interaction as long as it isn't the only sort and as long as I don't have to work to turn it off. :)
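    A sketch of that sort of opt-in interaction, using the core Getopt::Long module (the --interactive and --file names are made up for illustration): prompting happens only when explicitly requested, so batch and cron use stay safe by default.

```perl
#!/usr/bin/perl
# Opt-in "interactive mode": the script never prompts unless asked to.
use strict;
use warnings;
use Getopt::Long;

my $interactive = 0;
my $file;
GetOptions(
    'interactive' => \$interactive,   # hypothetical flag name
    'file=s'      => \$file,
) or die "usage: $0 [--interactive] [--file NAME]\n";

if ( !defined $file && $interactive ) {
    print "Which file would you like to use? ";
    chomp( $file = <STDIN> );
}
defined $file or die "usage: $0 [--interactive] [--file NAME]\n";
print "Using $file\n";
```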

    --
    brian d foy <brian@stonehenge.com>
      When a program prompts for missing but required input, that's not going to break any shell scripts or cron jobs. It might cause mysterious hanging instead of simple death when used incorrectly, though.

      Caution: Contents may have been coded under pressure.

        I don't prompt for it, I issue an error message. That stuff ends up in a log file or email message, and people can discover what they need to change.

        --
        brian d foy <brian@stonehenge.com>
Re: A thought about usability in Perl and Linux in general.
by samizdat (Vicar) on May 18, 2005 at 12:32 UTC
    Certainly if you build code from a personal template, you can have such an ability built in. I start from scratch because I'm usually solving a problem with Perl that is a known task.

    Command-line-oriented programs are very flexible. I built a 20-page program in bash that could make a *nix system sit up and beg, and it had about thirty major variants of flags as well as the ability to execute other programs from the launched shells.

    *nix users are typically more comfortable (than GUI users) with options in return for the freedom they give to customize the script's behavior. You can launch twenty things in the time it takes to sequence through one wizard. :D
Re: A thought about usability in Perl and Linux in general.
by zentara (Cardinal) on May 18, 2005 at 11:46 UTC
    You can write scripts that do it both ways. Just check for @ARGV, if they exist, go on with the script. If they are not there, prompt the user for values.

    Installing Perl from source is a good example of this. A minimal example is:

    #!/usr/bin/perl
    use warnings;
    use strict;

    my $count = 0;
    if ( defined $ARGV[0] ) {    # defined, so an argument of 0 still counts
        $count = $ARGV[0];
    }
    else {
        print "Enter integer to count to, and press return\n";
        $count = <>;
        chomp $count;
    }
    for ( 0 .. $count ) {
        print "$_\n";
    }

    I'm not really a human, but I play one on earth. flash japh
      From the OP:
      but whenever I write a small command line script [...] I always have it prompt for input if none is specified.
      Period.


      holli, /regexed monk/
      In the Unix world, invoking the command without parameters generally makes it choose a set of well-documented defaults.

      Prompting for parameters is also reasonable, but if you do that, please check that STDIN is a tty using the -t test.
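      A minimal sketch of that check (the file-name handling is made up for illustration): prompt only when STDIN is attached to a terminal, and die with a usage message otherwise, so a pipeline or cron job fails fast instead of hanging on a question nobody will answer.

```perl
#!/usr/bin/perl
# Prompt only for interactive users; -t tests whether a filehandle
# is attached to a tty, which is false under pipes and cron.
use strict;
use warnings;

my $file = shift @ARGV;
if ( !defined $file ) {
    if ( -t STDIN ) {
        print "Which file would you like to use? ";
        chomp( $file = <STDIN> );
    }
    else {
        die "usage: $0 FILENAME\n";
    }
}
print "Using $file\n";
```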

Re: A thought about usability in Perl and Linux in general.
by etcshadow (Priest) on May 18, 2005 at 17:58 UTC
    I think that the single biggest reason is the idiom under which many scripts and Unix-like programs are written: accept files (or pipes) for input, either as named files/named pipes on the command line or as STDIN. Likewise, they quite typically use STDOUT for their output (not for prompting users with questions and such), although they will often have some sort of switch that allows you to name the file(s) for output. Any reporting is typically done to STDERR, so that pipelines and/or output data files are not clogged up or confused with miscellaneous reporting.

    The fact that many, many things behave this way, tends to make people implement other things in a compatible fashion... even if they are not something that is reading in files and/or writing out files.

    Anyway, you ask why, and I think that's a pretty big part of why.

    update (addition): I meant to point out: perl itself very much supports this mode of thinking, by way of its -n and -p switches and its "diamond operator" <>, all of which implicitly iterate over opening and reading the files in @ARGV, or read from STDIN if @ARGV is empty.
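    For instance, the one-liner perl -ne 'print if /TODO/' notes.txt (the TODO pattern is just an example) behaves roughly like the explicit loop below, reading each file in @ARGV, or STDIN when there are none.

```perl
#!/usr/bin/perl
# Roughly what `perl -ne 'print if /TODO/' file ...` expands to:
# -n wraps the body in an implicit while (<>) loop.
use strict;
use warnings;

while (<>) {
    print if /TODO/;    # $_ holds the current input line
}
```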

    ------------ :Wq Not an editor command: Wq
Re: A thought about usability in Perl and Linux in general.
by Cap'n Steve (Friar) on May 18, 2005 at 19:33 UTC
    Thanks for pointing out the fact that pipes might be involved, I hadn't thought of that. So I guess I didn't give the greatest example, but I still think there are many programs that would benefit from this approach. It just annoys me when I have to prepare to run a program by going over the help message and slowly building the command to run.
      It just annoys me when I have to prepare to run a program by going over the help message and slowly building the command to run.
      This is just a matter of habituation (and maybe what sort of command-line shell you use). In your typical modern unix shell (bash, ksh, zsh and so on), hundreds of previously executed commands are kept in a command-history list; this list is kept in a file when you exit / log off / shut down, so that you can recall commands from yesterday or last Friday or after three reboots; there are simple, fast keystrokes for searching backwards in the history for each command containing a given string, and for editing command lines for re-execution; there are shortcut "placeholders" for using an argument from a previous command in your next command, and lots more features.

      Then there's the common X-Windows environment, with 3-button mouse for quick and easy copy/paste within and across windows, and the "xterm" window for running a shell, with its ability to scroll back over hundreds of lines of prior shell i/o. Put all these together, and building command lines of arbitrary length/complexity is pretty quick work -- helluva lot faster than answering a series of questions that each ask for just ... one ... single ... parameter ... (and can only be answered manually).

      In contrast, the typical "MS-DOS Prompt" shell is pretty painful to use on a repetitive or continuous basis, and there are several versions that are annoyingly different in the amount of command history/editing support they provide. The standard window used to run the shell is no help, with limited width/height/scroll-back and klunky mouse support (if any). Overall, using this sort of shell requires a lot more re-typing, and trying to use copy/paste operations involves lots of extra steps and higher risk of mistakes and wrist injury.

      And if you're running a program that actually asks questions (one ... parameter ... at ... a ... time ...), God help you if you get to the fourth question and notice a mistake you made in answering the second one. (Start over, from the top -- retype everything -- repeat till your typing is perfect!)

      Much better, I think, if the program will just say something to the effect of "Your last attempt at a command line was insufficient or incorrect; a proper command line should look like this: .... Please try again." Whenever possible, "option" args (e.g. "-v", "-log", etc) should be optional (meaning the program will do something sensible without them); it's okay to demand that there must be an arg for this or that purpose (input file, output file, or whatever), and if such args are missing, die and tell the user what's needed.
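      A minimal sketch of that style in Perl, using the core Getopt::Long module (the -v switch and the file argument are just illustrations): the option is genuinely optional, while a missing required argument dies at once with a usage line the user can fix and re-run from shell history.

```perl
#!/usr/bin/perl
# Optional switches get sensible defaults; a missing required
# argument produces a usage message and a quick death.
use strict;
use warnings;
use Getopt::Long;

my $usage   = "usage: $0 [-v] INPUTFILE\n";
my $verbose = 0;                      # optional: off by default
GetOptions( 'v' => \$verbose ) or die $usage;

my $input = shift @ARGV;
defined $input or die $usage;         # required: fail fast, don't hang

print "Processing $input\n" if $verbose;
```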

      When a command dies for lack of proper usage, the good command-line user just hits the "up-arrow" key to recall the previous command, adds or fixes the arg(s) as needed, and hits "enter" to try again. Repeat until it comes out right -- but from one repetition to the next, you should never have to type the whole damn thing over again.