in reply to Re^2: scripting tar
in thread scripting tar

Is there a specific reason why you would have to use perl for this

Well, that line was pulled from a more complex script that is far easier to do in Perl than in something like the bash shell. For instance, what if you were automating writing the tarballs to a CD/DVD, transferring them over a socket to another location, forking the work off, etc.? Additionally, file and directory operations are so much easier in Perl than in shell, especially if you are used to Perl and abhor shell syntax.
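As one small illustration of the file-and-directory point, here is a minimal, self-contained sketch that recursively deletes editor backup files with File::Find. The directory layout and filenames are made up purely for the example; doing the same thing portably in shell means juggling find(1), xargs, and quoting.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use File::Temp qw(tempdir);

# Hypothetical layout, created just so the script runs as-is.
my $dir = tempdir(CLEANUP => 1);
mkdir "$dir/sub" or die "mkdir: $!";
for my $f ("$dir/keep.txt", "$dir/junk.txt~", "$dir/sub/old.txt~") {
    open my $fh, '>', $f or die "open $f: $!";
    close $fh;
}

# One call walks the whole tree; the callback decides per file.
# Inside the callback, $_ is the bare filename in the current dir.
my $removed = 0;
find(sub { /~$/ and unlink and $removed++ }, $dir);

print "removed $removed backup file(s)\n";
```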

So....it is often better to use Perl, run the occasional command through system/backticks/pipes, and have the full power of Perl available when needed for the things shell doesn't do well.
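A quick sketch of those two styles of running an external command from Perl; "tar --version" is just a harmless example command that should exist on most systems.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Backticks: run through the shell and capture stdout.
my $version = `tar --version`;
die "tar not found: $?" if $? != 0;
print "captured: ", (split /\n/, $version)[0], "\n";

# List-form system(): no shell at all, so arguments such as
# filenames with spaces need no quoting; check the exit status.
system('tar', '--version') == 0
    or die "tar failed: $?";
```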


I'm not really a human, but I play one on earth Remember How Lucky You Are

Replies are listed 'Best First'.
Re^4: scripting tar
by slacker (Friar) on Jan 31, 2009 at 22:09 UTC
    My intention is not to hijack the above thread, but... would you mind providing an example where Perl is easier for manipulating files and directories than a shell? Unless I'm misunderstanding, I would almost always prefer mkdir, chmod, chown, cat, or whatever your platform provides over using Perl to do the same thing. Thanks.
      Would you mind providing an example where using Perl is easier to manipulate files and directories as opposed to using a shell

      Naw, I'm too lazy today. Suffice it to say, if shell were so easy, Perl (and other scripting languages) would never have gained so much popularity. That is proof enough.


        Well, as far as modifying permissions and basic sysadmin tasks go, yes, shell is easy; I would tend to agree. It does have its limitations, though. I was just hoping for an example of one of those limitations being handled better in Perl.

      I have an upload script that uses rsync over ssh to upload files to my webhost (now that I've finally moved to one that gives me ssh access, woo-hoo!). I first wrote it as shell. It went something like this:

      #! /bin/sh
      cd $(dirname $0)/site
      # clear out garbage
      find . -name '*~' | xargs rm
      # what do we want to sync?
      sync=$(ls | grep -v perllib | grep -v \\.xoops_)
      # what do we not want to prune remotely?
      remote_keep="*.ttc *.xoops_*"
      remote_keep=$(for x in $remote_keep; do echo "--exclude=$x"; done)
      # sync.
      rsync -avz "$@" --delete -e 'ssh -l myuser' $remote_keep $sync myhost.com:
      Where this got ugly was keeping the list of local excludes for uploading and the list of remote excludes for pruning. Those are ugly. Instead, I have:
      #!/usr/bin/perl
      use File::Spec;
      use FindBin;

      # make sure we're in the right directory.
      chdir File::Spec->catdir($FindBin::Bin, 'site');

      # clear out garbage.
      use File::Find;
      find( sub { /~$/ and unlink }, '.');

      # directories we want to sync...
      my @sync = grep { -d $_ and !/perllib/ and !/\.xoops_/ } glob '*';

      # remote directories we don't want to prune...
      my @remote_keep = map { '--exclude=' . $_ } qw(*.ttc *.xoops_*);

      # sync...
      system(
          #qw(/bin/echo),
          qw(/usr/bin/rsync -avz),
          @ARGV,
          qw(--delete -e), 'ssh -l myuser',
          @remote_keep, @sync,
          qw(mysite.com:)
      );
      I can move the lists to the top of the script, and make it really easy to add new excludes without having to put in a bunch of --exclude='s. Meanwhile, what I have locally actually works (with the extra modules installed locally in perllib), but I don't need/want that uploaded (since there are some XS modules, I want to build them on the remote host - I have a CPAN directory that I upload, too, with all the tarballs of modules I want to use).

      The shell version performed adequately. The Perl version uses only three processes (perl, rsync, and its ssh) and is easier to add more stuff to, IMO.

Re^4: scripting tar
by FredKJ (Novice) on Feb 02, 2009 at 19:10 UTC
    That's exactly why I need Perl instead of shell scripting. It has to be highly dynamic and automated, taking information from a config file for the files to tar.
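    A config-driven tar run along those lines might look like the sketch below. The config format (one path per line, '#' starts a comment) and all filenames are assumptions for illustration, not FredKJ's actual setup; a sample config is written first so the script runs as-is.

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Temp qw(tempdir);

    # Hypothetical config and data file, created so the sketch is runnable.
    my $dir = tempdir(CLEANUP => 1);
    open my $out, '>', "$dir/data.txt" or die "open: $!";
    print $out "payload\n";
    close $out;
    open my $cfg_out, '>', "$dir/tar_files.conf" or die "open: $!";
    print $cfg_out "# files to archive\n$dir/data.txt\n";
    close $cfg_out;

    # Read the list of files to tar from the config.
    open my $cfg, '<', "$dir/tar_files.conf" or die "open: $!";
    my @files;
    while (<$cfg>) {
        chomp;
        s/#.*//;           # strip comments
        next unless /\S/;  # skip blank lines
        push @files, $_;
    }
    close $cfg;

    # Date-stamped archive name; list-form system() avoids shell quoting.
    my ($day, $mon, $year) = (localtime)[3, 4, 5];
    my $archive = sprintf "%s/backup-%04d%02d%02d.tar.gz",
        $dir, $year + 1900, $mon + 1, $day;
    system('tar', 'czf', $archive, @files) == 0
        or die "tar failed: $?";
    print "wrote $archive with ", scalar @files, " file(s)\n";
    ```

    Because the file list comes from the config at run time, adding or removing files to archive means editing one line of the config, not the script.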