Unix shell versus Perl

by eyepopslikeamosquito (Archbishop)
on Feb 18, 2008 at 06:39 UTC

At work, we currently run a motley mix of Perl, Unix shell and Windows .BAT scripts. I'm putting together a case that Perl should almost always be preferred to Unix shell (and DOS batch). Here's why.

Writing portable shell scripts is hard. Very hard. After all, you must run a variety of external commands to get the job done. Is each external command available on each of our platforms? And does it support the same command line options? Is the behaviour the same on each platform? (echo, awk/nawk/mawk/gawk, sed, grep/egrep, find/xargs, and ps are some example commands that spring to mind whose behaviour varies between different Unix flavours). Many of these portability concerns disappear when you write a Perl script because most of the work is done, not by external commands, but by Perl internal functions and modules. It's easier to port a shell than a shell script. :-) Note that we port the same Perl version to all our Unix and Windows boxes, and so don't experience any nasty Perl version portability issues.
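
To make the point concrete, here is a minimal sketch (the filenames are made up): the Perl version does its work with built-ins and a core module, so there is no dependence on which external commands are on the PATH or which options they happen to support.

    use strict;
    use warnings;
    use File::Copy qw(copy);

    # Shell version for comparison:
    #   mkdir -p backup && cp data.txt backup/ && rm -f old.log
    -d 'backup' or mkdir 'backup' or die "error: mkdir 'backup': $!";
    copy('data.txt', 'backup/data.txt') or die "error: copy 'data.txt': $!";
    unlink 'old.log' if -e 'old.log';    # rm -f semantics: ignore a missing file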

Shell scripts tend to be fragile. Running commands and scraping their output is an inherently fragile technique, breaking when the format of the command output changes. Moreover, unlike Perl, shell scripts are not compiled when the script loads, so syntax errors may well be lurking down unexercised code paths: Perl catches syntax errors at compile time; shell catches them at run time. Notice that syntax errors can appear later when the script is presented with different data: test $fred -eq 42, for example, fails with a syntax error at run time if $fred contains whitespace or is empty (BTW, to avoid that, this code should be written as test "$fred" -eq 42). Finally, shell scripts can easily break if the global environment changes, especially the PATH environment variable. Even if the value of PATH itself doesn't change, a change to any file on the PATH has the potential to break a shell script (I remember one such case where GNU tar was mistakenly installed in a directory ahead of the system tar on the PATH on one of our AIX boxes).
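
For comparison, here is a rough Perl analogue of that test (FRED is a hypothetical environment variable): an empty or missing value is handled explicitly and produces at worst a warning under strict/warnings, never a run-time syntax error lurking down an unexercised code path.

    use strict;
    use warnings;

    # Perl analogue of: test "$fred" -eq 42
    my $fred = $ENV{FRED};    # may be missing or empty
    if (defined $fred && $fred =~ /^\d+$/ && $fred == 42) {
        print "fred is 42\n";
    }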

Shell scripts, being interpreted and often having to create new processes to run external commands to do work, tend to be slow.

Shell scripts tend to be insecure. Running an external command in a global environment is inherently less secure than calling an internal function. Which is why most Unices do not allow a shell script to be setuid. And shell doesn't have an equivalent of Perl's taint mode, to handle untrusted, potentially malicious, script data.
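
A minimal sketch of what taint mode buys you (the whitelist pattern and filenames below are made up): under -T, data from the command line, environment or other input is fatal if used to modify the outside world until it has been explicitly untainted, and external commands are refused until PATH is set to a trusted value.

    #!/usr/bin/perl -T
    use strict;
    use warnings;

    $ENV{PATH} = '/bin:/usr/bin';               # taint mode insists on a trusted PATH
    delete @ENV{qw(IFS CDPATH ENV BASH_ENV)};

    my $name = shift;
    defined $name or die "usage: $0 name\n";
    ($name) = $name =~ /\A([\w.-]+)\z/          # untaint via an explicit whitelist match
        or die "error: unsafe name\n";
    open my $fh, '>', "$name.log"               # fatal under -T if $name were still tainted
        or die "error: open '$name.log': $!";
    print {$fh} "ok\n";
    close $fh;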

Error handling and reporting tends to be more robust in Perl. For example, $! set by a failing Perl built-in function provides more reliable and specific error detail than $? set by a failing external command (especially when the external command is undisciplined in setting $? and in writing useful, detailed and regular error messages to stderr). Moreover, the ease of using the common Perl "or die" idiom and the newer autodie pragma tends to make Perl scripts more robust in practice; for example, chdir($mydir) or die "error: chdir '$mydir': $!" is commonly seen in Perl scripts, yet I've rarely seen shell scripts similarly check cd somedir and fail fast if the cd fails.
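
To illustrate the autodie style (the paths below are placeholders): the pragma turns the silent failure of built-ins into exceptions, so the "or die" boilerplate disappears while the fail-fast behaviour remains.

    use strict;
    use warnings;
    use autodie;    # built-ins now throw a descriptive exception on failure

    my $mydir   = '/var/log/myapp';    # placeholder paths
    my $logfile = 'app.log';

    chdir $mydir;                      # no "or die ..." needed
    open my $fh, '<', $logfile;        # likewise for open, close, unlink, ...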

As a programming language, shell is primitive. Shell does not have namespaces, modules, objects, inheritance, exception handling, complex data structures and other language facilities and tools (e.g. Perl::Critic, Perl::Tidy, TDD, Devel::Cover, Devel::NYTProf, Pod::Coverage, ...) needed to support programming in the large.

Finally, shell has fewer reusable libraries available. Shell has nothing comparable to Perl's CPAN.

Even if the script is small, don't write it in shell unless you're really sure that it will remain small. Small scripts have a way of growing into larger ones, and you don't want the unproductive chore of converting thousands of lines of working shell script to Perl, risking breaking a functioning system in the process. Avoid that future pain by writing it in Perl to begin with.

Notice that the above is arguing against Unix shell scripts, not specifically for Perl. Indeed, the same essential arguments apply equally to Python and Ruby as they do to Perl. However, I view Perl, Python and Ruby as essentially equivalent and, given our existing significant investment in Perl, don't see a strong business case for switching languages. I see an even weaker business case for using more than one of Perl/Python/Ruby because that dilutes the company's already overstretched skill base.

I'm further trying to come up with a checklist of when it is ok for work folks to write their scripts in Unix shell:

  • If the script must run in a Unix environment in which Perl is not available.
  • If you "know" the script will always remain very simple and short, less than around 20 lines of code.
  • If process startup time is significant. Note that shell has a lower startup cost than Perl.
  • If you're sure the script will "never" need to be ported to Windows.

I might add that I personally enjoy Unix shell scripting and have written a lot of Unix shell scripts over the years. It's just that I feel the argument for Perl over shell is overwhelming. Feedback on the above is welcome.

References

References Added Later

Updated 23 Feb 2008: Improved wording. Mentioned Perl's taint mode. Clarified Perl (compiled) v shell (interpreted). 26 Feb 2008: Added References section. Sep 2008: Added libraries (CPAN) paragraph. 2010: Added error handling paragraph. Mentioned newer autodie pragma. 2017: Added "References Added Later" section. 2023: Added more awk references.

Re: Unix shell versus Perl
by nimdokk (Vicar) on Feb 18, 2008 at 13:09 UTC
    I agree with what you have. Portability is the main reason I persuaded our team to use Perl instead of Unix shell and DOS batch jobs. Also, it makes it easier for new team members to learn how to script when they only have to learn Perl (as opposed to DOS batch and Unix KSH). The only KSH script that I have written for production-type use is a very basic script and all the commands would be system calls anyway (it could have been written in Perl but was just as easy to write in Korn). I have found that with Perl, it was easier to put in solid, reliable error handling. Lastly, modules make it easier to have repeatable routines that can be shared from one process to another.

    Just my 2 cents :-)

Re: Unix shell versus Perl
by samizdat (Vicar) on Feb 18, 2008 at 14:41 UTC
    Agree wholeheartedly.

    My criterion for switching (assuming Perl is available) is that if a script needs any testing or branching, it gets Perled. I have no problem with twenty lines of program calls, but if I need to start using any program control more complex than &&, Perl wins.

    I did have one large script (32K of ASCII including copious comments) I wrote at Sandia that was meant to be Bourne from the start, and it ended up being a really nifty system that included environment creation and program launching on the fly. In that case the reverse situation was true: Bourne shell was consistent, but the Perl version on the various boxen was not, and the systems support guy was not willing to fix all the various things that would have broken if we upgraded all Perl versions.

    We had our reasons for doing it the way we did, and it worked out because this really was a system-level task set. Doing it in Perl would have been more work because we would have had to add modules to do the things that shell does naturally.

    Don Wilde
    "There's more than one level to any answer."
Re: Unix shell versus Perl
by ruoso (Curate) on Feb 18, 2008 at 12:08 UTC

    I have a policy myself which is, whenever a shell script goes beyond 3 lines I stop and rewrite it in Perl. But one thing I will be very glad to use is the Perl 6 ==>, since it's somewhat boring to chain piped commands from Perl 5...

    daniel
Re: Unix shell versus Perl (saw)
by tye (Sage) on Feb 18, 2008 at 17:05 UTC
    If process startup time is significant. Note that shell has a lower startup cost than Perl.

    I'd not include that item. It sounds very much like premature micro-optimization. Recently someone was ranting about 'sed' being better in some cases than Perl because it starts up faster. My testing (on two very different systems) showed Perl starting up about twice as fast as 'sed' and both starting fast enough that the difference was unlikely to be noticed. I think you'd be hard pressed to properly document how little your shell script can do and still keep this "advantage" over a Perl script.
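
    For anyone who wants to repeat the measurement, a crude sketch along these lines (the command strings are just whatever no-op commands are installed locally) is enough to show whether startup time is even worth worrying about:

        use strict;
        use warnings;
        use Time::HiRes qw(gettimeofday tv_interval);

        # Very rough: run each no-op command repeatedly and report the average
        # wall-clock time per start.  system() goes via the shell, which adds
        # the same overhead to every measurement.
        for my $cmd ('perl -e 1', 'sh -c :', 'sed q /dev/null') {
            my $t0 = [gettimeofday];
            system($cmd) for 1 .. 100;
            printf "%-18s %.4f s per start\n", $cmd, tv_interval($t0) / 100;
        }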

    - tye        

Re: Unix shell versus Perl
by TGI (Parson) on Feb 18, 2008 at 18:49 UTC

    I decided to learn Perl because I was unsatisfied by what I could achieve with batch and shell scripting. Granted, I knew only a little shell scripting, but I did a lot of batch scripting. In particular, flow control and error checking are difficult and verbose in a batch script. Also, there are too many things that you just can't do without firing up a C compiler and writing a utility or searching for a ported unix tool. The unix shells are better off, since one rarely needs to hack up a custom tool, but one needs to learn awk and sed and find and a thousand other little programs to do really cool stuff. Perl's portability is key.

    The other thing that sold me on Perl over batch/shell scripting, is Perl's flexibility--once I built expertise in Perl, I knew I could apply it for many things other than the original problems I was trying to solve. Since you've already got a lot of Perl in your environment, using it in place of batch and shell will "deepen your bench"--more people will have and use skills that apply across your environment.

    As to reasons to write something in a shell or batch file, I wouldn't go for a checklist. I'd say simply that "Everything should be written in Perl unless there is a compelling reason to do otherwise." Too often I've "known" that something need not be portable or will always be small, only to be proven wrong. Neither "it doesn't need to be portable" nor "it's really short" is, by itself, a reason to write a batch or shell script instead of Perl. If either condition is false, there needs to be a very strong argument to go with a batch/shell solution over Perl.


    TGI says moo

Re: Unix shell versus Perl
by graff (Chancellor) on Feb 18, 2008 at 21:51 UTC
    Regarding your "checklist of when it is ok ... to write ... in Unix shell:"

    I agree with tye that the third bullet (process startup time) is better off not being mentioned. It's hard to imagine a case where startup time would be both so significant and so reliably favorable toward shell scripting rather than perl. On the other hand, it's easy to imagine cases where a shell script's tendency to invoke lots of sub-processes (e.g. running sed or grep inside a for loop) places it at a serious disadvantage relative to a perl script that can easily do the same task within a single process. I know you already cite this as a reason for using perl instead of a shell script -- it's just that this third item in the checklist could lead some readers astray.

    Also, I would elaborate on the first item in the list: a unix environment in which Perl is not available would include (very significantly) situations where unix is being run in single-user mode, or in similarly "defective" conditions (restoring from backups, etc) where any of /usr/bin, /usr/local/bin, /usr/local/lib (and so on) are not present -- i.e. where there is nothing else but the shell (and the other stuff in /bin) for getting things done. I think this consideration will help people to focus on the appropriate domain and scope for shell scripting.

    And for the shell scripts that are to be written for situations like that, it should be strongly emphasized that the second checklist item (keep it simple and short) is likely to be a matter of "life or death."

Re: Unix shell versus Perl
by Abe (Acolyte) on Feb 19, 2008 at 16:47 UTC

    I have to step in and defend the shell.
    Done right, the shell will give you great productivity, better than perl for many tasks.

    The trouble with the shell is that programmers often forget structured programming. So one sees splats of shell done again and again to do simple things like file archiving, database access, file loads/unloads, ftps, error reporting, etc.

    Any task with complicated programming should be wrapped - in perl, C, C++, or a carefully done shell script or function.

    Any task done frequently should go to a library.

    Error handling - you can catch any error - you needn't lose anything. But you must NOT redirect scripts' stdout and stderr, except at the topmost level of control. If you're starting processes in the background, their failure can be signalled by touching a defined file, since you can't reap their exit status. Simple but effective.

    Meanwhile, for foreground commands, errors are always sent back to the caller in $?, same as perl.

    If you have to produce a script for several platforms, perl might be better, I don't know, but this is a discipline in itself. I'm skeptical though.

    ksh behaviour is standard - "echo" and "print" are defined. Korn shell 93 seems to be an attempt to do perl things, particularly hashes, which perl does much better.

    If you have to migrate platforms, it is fairly easy, but retest - you'd have to do the same for perl.

    Korn shell job control is wondrous compared with perl. Just start all your concurrent processes with & on the end, and then "wait" for them. This leads to very simple scripts that can efficiently run complex jobs with maximum concurrency of their different elements - very useful on our big RDBMS. Korn functions you start in the background, anonymous or named, can then further parallelise other tasks. Before you know it, and incredibly simply, you've got everything happening at once, and a 2-hour job takes 10 minutes. All the clever bits are in SQL, perl, C or whatever, with a couple of hundred lines of korn to glue it together.
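
    For readers weighing the two approaches, here is a rough Perl sketch of the same fan-out-and-wait pattern using core fork and waitpid (the job commands are made up); it is noticeably more verbose than & plus wait, which is rather the point being made above:

        use strict;
        use warnings;

        my @jobs = ('load_customers.sh', 'load_orders.sh', 'rebuild_index.sh');   # hypothetical commands
        my %running;
        for my $job (@jobs) {
            my $pid = fork;
            die "fork failed: $!" unless defined $pid;
            if ($pid == 0) { exec $job or die "exec $job: $!" }    # child: become the job
            $running{$pid} = $job;                                 # parent: remember it
        }
        while (%running) {
            my $pid = waitpid(-1, 0);                              # block until any child finishes
            last if $pid <= 0;
            warn "$running{$pid} exited with status $?\n" if $?;
            delete $running{$pid};
        }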

    Windows .bat files are contemptible. For all but tiny scripts, shell or perl seem essential. What's amusing is how managers intervene to prevent this, insisting on VB or some custom interpreter, produced because perl/shell are "not invented here".

    We have a binary for executing SQL on the db in a simple manner, a shell script that wraps db bulk-load/unload effectively for our platform, and a couple of shell libraries to wrap, standardise, and de-skill things like ftp, errors and warnings, "ps" process checks, plus one or two other libraries and perl utils for larger, standardisable things.

    Kornshell isn't suitable for complex programs, but properly done, it can simplify, speed up and provide efficient glue.

    Kornshell does need proper programming discipline.

    Perl is still fantastic, but I find kornshell much better for these jobs, gluing utilities and libraries together quicker and better than perl.

    Incidentally it's perfectly easy to make similar mistakes in perl. A lot of our perl uses OO with too many inheritance levels - the code is so dense it's more like looking at COBOL - quite an achievement.
Re: Unix shell versus Perl
by peterdragon (Beadle) on Feb 18, 2008 at 23:44 UTC
    I've pretty much given up writing shell scripts longer than 20 lines.

    The main reason is error handling. Exceptions, warnings, controlled failures? Forget all that in shell script, one bad error and it fails, no warning, no log.

    A secondary reason is portability. Let's take "echo". Depending on whether you're on Solaris or AIX or Linux, my guess is you're wondering whether to run /bin/echo or builtin echo or echo -e or whatever to get a \n translated into a newline. Have you ever looked at the shell script behind GNU configure? That says it all, and what it's saying is "should have been written in Perl!".

    Okay, back a ways, when you might have had perl4 or perl5.05 or perl 5.06 out of the box with your distro, it made sense to use shell for more portability. Since the advent of perl 5.06, the language has been stable enough (more so than sh/csh plus the Unix of choice) to be preferable.

    Regards, Peter
    http://perl.dragonstaff.co.uk

Re: Unix shell versus Perl
by spurperl (Priest) on Feb 18, 2008 at 18:49 UTC
    I have a vague recollection of something that can be done by .bat files but not by a Perl script. I don't remember for sure but it has to do with environment variables.

    IIRC you can set env vars in a .bat script that will persist after it exits, in the same command window. This is used, for instance, for setting up Visual Studio paths (the venerable vcvars32.bat script). I don't think this can be done in Perl ?!

      Your variables will stick around unless you do a setlocal; the default behavior is to screw up your environment. I just love DOS.


      TGI says moo

Re: Unix shell versus Perl
by dwm042 (Priest) on Feb 19, 2008 at 19:03 UTC
    Early in my career, one of my first professional jobs was in a hard core korn shell/C shop. Easy stuff was done in korn. Anything requiring more than a page we wrote in C. Now, as I read this post, I think you're trying to focus on negative reasons to not use shell. Some of them seem forced to me (slow? What ksh programmer worth their salt uses, oh, an external call to sed or awk for what a read loop would do?).

    I think it misses the point entirely. The main reasons to switch are power and productivity. Rather than beating people out of skills they might be expert at, show them how much more can be done with a reasonably sized Perl script. If you tell them they'll get the same jobs done in a fraction of the time, a fraction of the space, and the result will do more than they could with shell, then you have a potent argument.

    Tell them they can write daemons in Perl. Tell them they can write dynamic web pages. Tell them they can set up small Internet servers. Tell them their scripts don't have to use email anymore for notification; they can open a port and send the data directly into their alerts system.

    Or, if they come from a hard core korn shell/C environment, tell them they can do in Perl most of what used to require C. And since Perl is interpreted, you develop code faster.

      One job I had many years ago was writing a standard installer for a Unix app written in C. I used shell. But it had to run on AIX, ICL Unix, DEC Unix, Sys V SCO OpenUnix and BSD. There was very little standardisation across shells on those platforms, and it was a nightmare to work around the quirks of the different machines.

      Let me give another example of shell suckiness. Write a script to run FTP, check the output and if it fails do some rollback action. Perl, easy (Net::FTP). Shell, you run, grep and pray. Again, different platforms, different ftp command line syntax... Linux, Solaris, argh.
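
      A minimal Net::FTP sketch of that idea (the host, credentials, filename and rollback routine are all placeholders): the transfer either succeeds or you get a testable failure with the server's message, no output-scraping required.

          use strict;
          use warnings;
          use Net::FTP;

          my $ftp = Net::FTP->new('ftp.example.com', Timeout => 30)
              or die "error: cannot connect: $@";
          $ftp->login('user', 'secret') or die "error: login failed: ", $ftp->message;
          $ftp->binary;
          unless ($ftp->put('report.csv')) {
              warn "error: upload failed: ", $ftp->message;
              rollback();                            # hypothetical cleanup for the failed transfer
          }
          $ftp->quit;

          sub rollback { warn "rolling back\n" }     # stand-in for the real undo action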

      Regards, Peter
      http://perl.dragonstaff.co.uk

        My experiences are different. I once rewrote from scratch an installer/upgrader for the product the company I worked for produced. The result, a 2.5k shell/awk program, using isql (Sybase command line tool) to do its interaction with the database. Initially developed on Solaris, but I knew it had to run on HP-UX and Windows NT as well. The porting took less than 30 minutes - I was using an option to grep HP-UX didn't know about, and Windows NT (which had a Unix toolkit installed) didn't know about /tmp.

        Later, I've used shell scripts initially developed on Linux without many problems under Windows/cygwin.

        Of course, you may say, Windows doesn't come with a Unix toolkit. You are right of course. But Windows doesn't come with Perl either, so whether you want to use Perl or shell to do your cross-platform scripting, you will have to install something.

        Why I didn't write the installer I mentioned in the first paragraph in Perl, you may ask. Well, at that time, Solaris didn't come with perl, so we couldn't assume perl was available. And I hadn't convinced the company I worked for that we should just bundle perl with our product. Yet. A year later, I had to write a different installer. By then, we did bundle perl with our product, and that one was written in Perl. By that time, we had dropped Windows as a platform, so I don't know whether it would have worked on Windows.

Re: Unix shell versus Perl
by sundialsvc4 (Abbot) on Feb 19, 2008 at 17:03 UTC

    I generally concur with that opinion, although I would have stated it in far fewer paragraphs and bullet-points, and I would counsel you to do the same. (“Preachers” tend to get crucified, even if their message is immortal.)

    Simply stated, I think that “shell scripts” are intended to be just that:   a moderately sophisticated way to tie shell-commands together. Nothing less, and nothing more. Shell-designers weren't trying to invent a general-purpose programming language (like Perl) since such languages already existed. Instead, they gave you the ability to use any command-processor you like, through the simple mechanism of (#!commandname) “shebang.”

    A shop should agree upon a working-standard and then stick with it. But they should make sure that they're using the right tool for a particular job. All you'll get for the wasted-time that you just spent proving that a wrench can be used as a jackhammer is maybe “w00t! w00t!” (While you're perfecting your curious monstrosity, your colleagues are munching fish-n-chips down at the pub.)

      But they should make sure that they're using the right tool for a particular job.
      Using the right tool for the job is important; I've worked at shops where people wrote thousands of lines of C to accomplish what could be done in ten lines of Perl. Yet companies need to set a practical limit on the number of supported languages when they commit to maintaining code in these languages over a period of many years. After all, mastering, as opposed to dabbling in, a language, and its libraries, and its community, takes a lot of time and effort.

      What is a sound practical limit on the number of languages a company can comfortably support? I don't know, and it depends on the company, but my perhaps conservative opinion is that my company should support just one "fast" statically typed language and just one "dynamic" language. Maybe two. Any more than two would be a mistake IMHO. For example, I feel writing part of our system software in D, another part in Haskell, another in Erlang, and another in C++ would be a strategic mistake, even if each was indeed the "right tool for the job". Ditto for writing in a combination of Perl, Ruby, Python, and Lua.

      Update: Even a company as big as Google only allows three languages to be used for production code, namely C++, Java and Python.

Re: Unix shell versus Perl
by DrHyde (Prior) on Feb 19, 2008 at 10:31 UTC
    Writing portable perl scripts is hard too.

      Portable anything is difficult; the question is how difficult.

      If you give me the option of porting a shell script and a Perl script to a win32 environment, I'll take Perl. It's still not a fun job, but at least it isn't a painful job.

Re: Unix shell versus Perl
by starbolin (Hermit) on Feb 20, 2008 at 07:21 UTC

    eyepopslikeamosquito writes:

    "Shell scripts tend to be insecure. Running an external command in a global environment is inherently less secure than calling an internal function. Which is why most Unices do not allow a shell script to be setuid."
    Suidperl is of questionable status. While some have said it should be considered deprecated (http://www.xray.mpe.mpg.de/cgi-bin/w3glimpse2html/perl5-porters/2008-01/msg00949.html), others are interested in keeping it alive. Indeed, Perl ships without setuid support compiled in on some distros.

    A lot of the negatives built into some of the shells can be seen in differing degrees in perl. Its interpreted nature means it can be slow. Its eclectic feature set means it can be difficult to audit. Its cooperative nature means scripts can call questionable binaries. Its comprehensive nature means its footprint is large. So it would be dangerous to attack shells based on their architecture.

    "Shell scripts, being interpreted and often having to create new processes to run external commands to do work, tend to be slow."

    Perl is also interpreted. Although I think that is not the crux of your argument, I would choose different wording.


    s//----->\t/;$~="JAPH";s//\r<$~~/;{s|~$~-|-~$~|||s |-$~~|$~~-|||s,<$~~,<~$~,,s,~$~>,$~~>,, $|=1,select$,,$,,$,,1e-1;print;redo}
      Perl is not interpreted. At least it is not in the same sense as shell. Shell scripts are source-level interpreted. Perl programs go through a full-scan compilation into another form which is then executed. It isn't translated directly into machine code, but syntax errors are caught at compilation time and not run time. The compilation generally happens every time you run the program, but that doesn't mean it doesn't happen.

      Nearly every useful language has the ability to execute external commands. Most shells rely on that for most of their utility. If you think Perl is insecure because it can call an external binary, then what of shell, which often must?

      Also related to how shell is used is your footprint comment. How is Perl's footprint compared to every Unix command available on every version of Unix? Undisciplined shell programmers can use every CLI-based program on the system to get their work done. They often must use the search path to attempt any kind of portability, and can be affected by other environment differences. An undisciplined Perl programmer who uses every dark corner of Perl can at least expect some compatibility among different installations of the same version. If you're depending on external tools for most of your functionality, you can't count on much of anything.

      Shell doesn't have an eclectic feature set? Again, you're talking about the entire installed command set of whatever machine you're on, often depending on the search path and environment variables.

      Perl has its weaknesses, but I don't think you're assessing them fairly here. Perl and most shells are worlds apart.

      As mr_mischief has already pointed out, Perl scripts are first compiled to an internal form, ensuring syntax errors are caught at compile time rather than run time ... and considerably speeding up script execution at run time.

      To give a specific example of where shell can be slow, consider the common task of running an external command recursively on all files under a given directory. In shell, you could use the find command with its -exec option, but that results in a new process being created for every file. I've measured cases traversing thousands of files where such an approach proved to be hundreds of times slower than using Perl's File::Find module in harness with a Perl function performing the work of the external command. The performance difference is especially noticeable when the external command doesn't do much work. Of course, the way to solve this performance problem in shell is to use find in harness with xargs, but there are portability pitfalls for the unwary, namely when the filenames contain spaces or other unusual characters -- as indicated in a response to a 2002 gnat use.perl.org journal entry. Update: broken link, these are the commands:

      find . -print0 | xargs -0 ls -l                 (GNU only)
      find . -print | sed 's/ /\\ /g' | xargs ls -l   (works everywhere)
      Found an archived link: gnat use.perl.org journal entry (see response by oneiron)
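
      For completeness, a minimal sketch of the File::Find approach described above (the per-file work() routine is a made-up stand-in): the whole traversal runs in one process instead of forking a command for every file.

          use strict;
          use warnings;
          use File::Find;

          # Equivalent in spirit to: find . -type f -exec <command> {} \;
          # but without creating a new process per file.
          find(sub {
              return unless -f $_;            # $_ is the current basename
              work($File::Find::name);        # full path, handled in this same process
          }, '.');

          sub work {
              my ($path) = @_;
              print "$path\n";                # stand-in for the real per-file work
          }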

Re: Unix shell versus Perl
by starbolin (Hermit) on Feb 23, 2008 at 05:53 UTC

    As eyepopslikeamosquito and mr_mischief point out, there are important differences between perl's interpreter and others. However I was attempting, albeit poorly, to point out that simply saying "shells are bad because they're interpreted" could backfire if you are trying to sell perl. My concern fell out of some confusion I had when reading the OP as to whether, briefly, he was talking about shells or perl with regard to interpreters.

    As a separate caution, those PHBs unfamiliar with perl can be uneasy with perl's power and diversity. It may be best to soft-sell these points and concentrate on portability and organizational issues where having 'one code to rule' offers clear advantages.

    I apologize that my previous post was cryptic and confusing and I thank eyepopslikeamosquito and mr_mischief for setting me straight. I am well aware, however, of perl's capabilities and I love beautiful perl code like the rest of the Monks here. However, not all of perl's features are optimized for systems programming and I find some of its features to be quite clunky in those applications. I think the Perl Evangelist would profit well from gleaning from fellow Monks which characteristics offer a clear, and easy to communicate, distinction from other shells. Said evangelist would likewise profit from avoiding those comparisons which could be misconstrued by the casual observer as semantic sugar.


    s//----->\t/;$~="JAPH";s//\r<$~~/;{s|~$~-|-~$~|||s |-$~~|$~~-|||s,<$~~,<~$~,,s,~$~>,$~~>,, $|=1,select$,,$,,$,,1e-1;print;redo}
Re: Unix shell versus Perl
by edignan (Initiate) on Feb 20, 2008 at 23:55 UTC
    I worked a project using KSH script on UNIX machines to execute test scripts against embedded code. Please note, the goal was to test the code. The biggest problems we had were migrating non-Unix users into an "environment" environment. If Perl could get you around the number of wasted man-hours we invested in debugging scripts that worked under one sign-in and not the next due to environment variables.... We burned a lot of man-hours NOT testing the target code, with scripts that "should" have run; all due to a nominally controlled user environment. Is Perl going to help me avoid the difficulties induced by a bad environment to begin with? Will it help you? Emmett
Re: Unix shell versus Perl
by Bloodnok (Vicar) on Aug 05, 2008 at 12:47 UTC
    Aside from the fact that there is, undeniably, only perl for scripting on Windoze machines (if that isn't a contradiction in terms), one thing that, for *NIX machines, hasn't, AFAICT, yet been mentioned is that perl is usually installed on a partition other than the root partition - thus any system scripts requiring the availability of perl (and the necessary parts of its myriad of wonderful library modules) cannot start until the appropriate partition(s) has/have been checked and mounted ... but to get to that point perl must be available... mmmm, bit of a circular dependency thingy goin' on here methinx.

    IMO, this is a classic case of horses for courses - shell script is marvellous (C shell excepted of course - see Tom Christiansen's http://www.perl.com/pub/language/versus/csh.html) for *NIX systems programming - for all else, perl is, by far & away, the front runner.

    Just my 10 p'worth...

    At last, a user level that overstates my experience :-))
      Well, that quite depends on the setup. Many Linux distros use one big partition by default anyway (or at least have /usr on the root mount), so once you have anything, you have perl (assuming it's in /usr/bin/perl). And traditionally, the root mount contained /bin with just a small set of programs, just enough to get to the stage to mount other file systems. Anything interesting would be in /usr/bin anyway. Furthermore, mounting other file systems happens pretty early in the boot process anyway - so most rc scripts will have perl available, even if perl is not on the root mount.

      But I've also worked for a company where we used Linux boxes that only had 2 Mb of memory, 25 Mb disks, and for which the OS had to be installable from a single floppy disk. Needless to say, said boxes didn't have Perl, although I used Perl a lot to create the distros.

        ...there's an awful lot "goes on" after getting to single user run level...i.e. before the rest of the file systems are checked & mounted.

        A user level that continues to overstate my experience :-))
Re: Unix shell versus Perl
by JavaFan (Canon) on Aug 05, 2008 at 14:32 UTC
    Let me first say that I almost always prefer Perl over the shell as well. But I don't always agree with your arguments; they are a bit too black and white for me.

    Writing portable shell scripts is hard. I don't think it's hard, although it's easy to write something that is not portable. But you have several options available to make porting easier. First is to limit yourself to the POSIX standard. Most Unix vendors support at least POSIX compliance for their shell tools. The second option is to use the GNU tools, and install the GNU tools on all the platforms your program needs to run on. GNU tools have been ported to all major Unix platforms, and most run on Windows as well. Is each external command available on each of our platforms? That's a valid question to ask when porting a shell program, but an equally valid question is to ask whether each Perl module you use is available on each of the platforms you use. This may usually be true for pure Perl modules, but is not always true for XS modules. Especially not if said modules use third-party libraries.

    Shell scripts tend to be fragile. Running commands and scraping their output is an inherently fragile technique, breaking when the format of the command output changes. My experience is that the behaviour of shell commands tends to change less than the behaviour of Perl modules. Sure, a shell command may output something different if you upgrade it, but a Perl module may change its behaviour as well if you upgrade it. And Perl modules don't have a standard - shell programs do: POSIX. And then there's perl itself. In my 25+ years of programming, I cannot recall a program breaking because of an upgrade of the shell. An upgrade of Perl always tends to break at least one of my programs in some way.

    Shell scripts tend to be slow. That's 'the pot calling the kettle black', isn't it? Agreed, the shell doesn't shine when it comes to speed, but neither does Perl, does it? Luckily, with modern machines, it's usually the disk or the network (or the memory) that's the bottleneck, so the relative slowness of the language itself doesn't really matter. But in the few cases where it did, I'm always amazed by the speed of C compared to Perl (every now and then I rewrite a Perl program in C for speed reasons).

    Shell scripts tend to be insecure. Running an external command in a global environment is inherently less secure than calling an internal function. Eh, not necessarily. Languages don't make programs (in)secure. Programmers do. And frankly, I have more trust in what my vendor puts in /usr/bin than what I download from CPAN. I do not agree that running an "external command" is "inherently" less secure than calling an "internal function". And I've no idea what you mean by "global environment".

    As a programming language, shell is primitive. Shell does not have namespaces, modules, objects, inheritance, exception handling, complex data structures and other language facilities and tools (e.g. Perl::Critic, Perl::Tidy, Devel::Cover, Pod::Coverage, ...) needed to support programming in the large. Up to version 5, Perl didn't have namespaces, modules, objects, inheritance or exception handling. And C still doesn't have them. And depending on what you mean by 'complex data structures', C doesn't have them either. As for Perl::Critic, Perl::Tidy, Devel::Cover, Pod::Coverage, I disagree that they are needed - in fact, I wouldn't want to use Perl if they were needed to program in Perl. And in the history of Perl, they are the new kids on the block. People preferred perl over shell long before those tools were first created.

    If you're sure the script will "never" need to be ported to Windows. Been there, done that. Shells (and shell-like tools) have been ported to Windows, just as Perl has. And remember the time the perl port on Windows was different from the one on Unix? At that time, that wasn't true for some of the shells. "Oneperl", the perl that merged the Unix and Windows ports, only dates from 1998; macperl merged even later than that (and that only because MacOS went Unix).

      I do not agree that running an "external command" is "inherently" less secure than calling an "internal function". And I've no idea what you mean by "global environment".
      By global environment, I was referring to environment variables (e.g. PATH, IFS, CDPATH, ENV, BASH_ENV, SHELL, TZ, LD_LIBRARY_PATH) and other elements of the execution environment (e.g. umask, inherited file descriptors, temporary files) that are a common source of exploits by malicious attackers. Certainly, executing an external program securely is not trivial: there are many, many security exploits to consider and guard against. That's why I stated that calling an internal function was inherently more secure -- because all these many and varied exploits need not be considered.

      To give a specific example, most shell scripts tend to use the (potentially insecure) $HOME and $SHELL environment variables to ascertain a user's home directory and shell, while a Perl script can get this information via the more secure (and more reliable) getpwnam internal function.
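
      As a small sketch of that last point, the lookup for the current user might read (getpwuid is used here rather than getpwnam since no username is to hand):

          use strict;
          use warnings;

          # Consult the password database rather than trusting $HOME and $SHELL,
          # which a caller could have set to anything.
          my ($name, $home, $shell) = (getpwuid($<))[0, 7, 8];
          print "user=$name home=$home shell=$shell\n";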

      That shell scripts tend to be insecure is widely known and acknowledged; see, for example, FAQ: How can I get setuid shell scripts to work? and perlsec, which opens with:

      Unlike most command line shells, which are based on multiple substitution passes on each line of the script, Perl uses a more conventional evaluation scheme with fewer hidden snags. Additionally, because the language has more builtin functionality, it can rely less upon external (and possibly untrustworthy) programs to accomplish its purposes.
