in reply to Re^3: scripting tar
in thread scripting tar

My intention is not to hijack the above thread, but... would you mind providing an example where Perl is easier for manipulating files and directories than a shell? Unless I'm misunderstanding, I would almost always prefer mkdir, chmod, chown, cat, or whatever your platform provides, over using Perl to do the same thing. Thanks.

Re^5: scripting tar
by zentara (Cardinal) on Jan 31, 2009 at 22:15 UTC
    Would you mind providing an example where Perl is easier for manipulating files and directories than a shell?

    Naw, I'm too lazy today. Suffice it to say, if shell were so easy, Perl (and other scripting languages) would never have gained so much popularity. That is proof enough.


    I'm not really a human, but I play one on earth Remember How Lucky You Are
      Well, as far as modifying permissions and basic sysadmin tasks go, yes, shell is easy. I would tend to agree it does have its limitations, though. I was just hoping for an example of one of those limitations being handled better in Perl.
        Well, as I was dreaming last night, I felt bad about not answering better, so I thought up a typical real-world problem that one might face.

        The problem:

        You must search through a list of subdirs defined by a specific alphanumeric pattern. In each of those subdirs, you must locate a specific file (defined by another alphanumeric pattern) and search through that file for specific name(s) given as options on the command line. You must include logic to account for abbreviation of the middle initial/name. When you find those names, extract an account number and save it. Next, go to another list of subdirs (defined by yet another alphanumeric pattern) and search all files for that account number. If an account number is found, add the file to a backup tarball, then delete the account number from the file and save the resulting file to another tarball, in a directory named after the account number. The tarballs cannot exceed 500 megs and must be named sequentially. When one in the sequence is done, transfer it via SFTP to another computer and email the account manager.

        :-)

        Now, I'm sure that this can be done with a multi-line shell script full of pipes, sed, % signs, echo, and grep; but you would need one heck of a lot of shell expertise, and most people would not be able to follow its logic with all the pipes.

        In Perl, however, a clear script could be written, with intermediate arrays available for debugging printouts, etc.
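
        To make that concrete, here is a minimal sketch of just the first half of the problem (finding the names and pulling out the account numbers). The dept-NNNx subdir pattern, acct_*.dat file pattern, and ACCT:NNNNNNNN account format below are made up for illustration; the point is that each stage lands in a named array you can print and inspect:

        #!/usr/bin/perl
        # Sketch only: the subdir pattern, file pattern, and account-number
        # format are hypothetical stand-ins for the ones in the problem above.
        use strict;
        use warnings;
        use File::Find;

        my ($first, $last) = @ARGV;
        die "usage: $0 FIRSTNAME LASTNAME\n" unless $first and $last;

        # match "John Smith", "John Q. Smith", or "John Quincy Smith"
        my $name_re = qr/\b\Q$first\E(?:\s+\w+\.?)?\s+\Q$last\E\b/;

        my @accounts;    # intermediate array, easy to dump while debugging
        find(sub {
            return unless -f $_ and /^acct_\w+\.dat$/;                 # file pattern
            return unless $File::Find::dir =~ /\bdept-\d{3}[a-z]\b/;   # subdir pattern
            open my $fh, '<', $_ or do { warn "can't read $File::Find::name: $!"; return };
            while (my $line = <$fh>) {
                next unless $line =~ $name_re;
                push @accounts, $1 if $line =~ /\bACCT:(\d{8})\b/;
            }
        }, '.');

        print "accounts found: @accounts\n";

        The tarball, SFTP, and email stages would layer on top of this in the same style, with modules like Archive::Tar and Net::SFTP, but are omitted here.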

        I will concede that shell can be faster and simpler for some simple operations, BUT in general it is hard to follow, full of hard-to-decipher option lists and pipes, for most real-world tasks. Sure, just recursively tarring up a directory is simple with the shell, but it's the other things that are usually needed that cast shadows over shell and make Perl shine.
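
        For what it's worth, even the "simple" recursive tar stays readable in pure Perl with the core Archive::Tar module. This is only a sketch; the directory and archive names are examples:

        #!/usr/bin/perl
        # Roughly "tar czf backup.tar.gz somedir", using core modules only.
        # 'somedir' and 'backup.tar.gz' are example names.
        use strict;
        use warnings;
        use File::Find;
        use Archive::Tar;

        my @files;
        find(sub { push @files, $File::Find::name if -f $_ }, 'somedir');

        # second argument is the gzip compression level (1-9)
        Archive::Tar->create_archive('backup.tar.gz', 9, @files)
            or die "tar failed: ", Archive::Tar->error;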


        I'm not really a human, but I play one on earth Remember How Lucky You Are
Re^5: scripting tar
by Tanktalus (Canon) on Jan 31, 2009 at 22:37 UTC

    I have an upload script that uses rsync over ssh to upload files to my webhost (now that I've finally moved to one that gives me ssh access, woo-hoo!). I first wrote it as a shell script. It went something like this:

    #! /bin/sh
    cd $(dirname $0)/site
    # clear out garbage
    find . -name '*~' | xargs rm
    # what do we want to sync?
    sync=$(ls | grep -v perllib | grep -v \\.xoops_)
    # what do we not want to prune remotely?
    remote_keep="*.ttc *.xoops_*"
    remote_keep=$(for x in $remote_keep; do echo "--exclude=$x"; done)
    # sync.
    rsync -avz "$@" --delete -e 'ssh -l myuser' $remote_keep $sync myhost.com:
    Where this got ugly was simply keeping the list of local excludes for uploading and the list of remote excludes for pruning. Those lists get ugly. Instead, I have:
    #!/usr/bin/perl
    use File::Spec;
    use FindBin;

    # make sure we're in the right directory.
    chdir File::Spec->catdir($FindBin::Bin, 'site');

    # clear out garbage.
    use File::Find;
    find( sub { /~$/ and unlink }, '.');

    # directories we want to sync...
    my @sync = grep { -d $_ and !/perllib/ and !/\.xoops_/ } glob '*';

    # remote directories we don't want to prune...
    my @remote_keep = map { '--exclude=' . $_ } qw(*.ttc *.xoops_*);

    # sync...
    system(#qw(/bin/echo),
        qw(/usr/bin/rsync -avz),
        @ARGV,
        qw(--delete -e), 'ssh -l myuser',
        @remote_keep,
        @sync,
        qw(mysite.com:)
    );
    I can move the lists to the top of the script and make it really easy to add new excludes without having to put in a bunch of --exclude= flags. Meanwhile, what I have locally actually works (with the extra modules installed locally in perllib), but I don't need or want that uploaded: since there are some XS modules, I want to build them on the remote host. (I have a CPAN directory that I upload, too, with all the tarballs of modules I want to use.)
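
    A sketch of what that looks like; the variable names here are illustrative, not from the real script:

    # Illustrative only: the tweakable lists pulled up to the top, so adding
    # a new exclude is a one-line change rather than another --exclude= blob.
    my @local_skip  = qw(perllib .xoops_);     # never upload these
    my @remote_keep = qw(*.ttc *.xoops_*);     # never prune these remotely

    my @sync      = grep { my $d = $_; -d $d and !grep { $d =~ /\Q$_\E/ } @local_skip } glob '*';
    my @keep_opts = map  { "--exclude=$_" } @remote_keep;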

    The shell version performed adequately. The Perl version uses only three processes (perl, rsync, and its ssh), and is easier to add more stuff to, IMO.