The first thing I ever posted here was the first real Perl program I had ever "cronned" on my system. It was a useful little script to back up directories, with support for excluding files and subdirectories. I tweaked it a bit recently to make it much cleaner and a lot more functional as far as reporting information to the user goes, so here's the new backup.pl that's running on my system:
#!/usr/bin/perl -w
#
# Script used to back up files to another
# partition.
#
# Code by Brad Lhotsky <brad@divisionbyzero.net>
#

$|++;
use strict;

###########################
# Configs:
#
my $MOUNT      = "yes";
my $PARTITION  = "/dev/hdd1";
my $MOUNTPOINT = "/mnt/tmp";
my $BACKUPDIR  = $MOUNTPOINT;
my $CONFIG     = "/root/backup/backup.conf";
my $ERRORFILE  = "/root/backup/backup.err";

die "config file $CONFIG couldn't be found: $!\n" unless -e $CONFIG;

my %TARS = ();

unlink($ERRORFILE) if -e $ERRORFILE;

if($MOUNT) {
    system("mount $PARTITION $MOUNTPOINT");
}

open(CFG, "<$CONFIG") or die "couldn't open $CONFIG: $!\n";

while(local $_ = <CFG>) {
    chomp;
    s/^\s+//;
    s/\s+$//;
    next if /^#/;
    next if !$_;
    (local $_, my @JUNK) = split /#/;
    s/^\s+//;
    s/\s+$//;
    my ($tarfile,$directory,@EXCLUDES) = split;
    push @EXCLUDES, '..';
    push @EXCLUDES, '.';
    my $ok = &backup($tarfile,$directory,\@EXCLUDES);
    if($ok) {
        print "Tarred $tarfile successfully! ($directory)\n";
        $TARS{$tarfile} = 1;
    } else {
        print "Tar of $directory to $tarfile failed!\n";
    }
}

foreach my $tarfile (keys %TARS) {
    &compress($tarfile);
}

if($MOUNT) {
    system("umount $PARTITION");
}

sub backup {
    my ($tarfile,$dir,$exc) = @_;
    return 0 unless $tarfile && $dir && $exc;

    if($dir !~ /\/$/)     { $dir .= '/'; }
    if($tarfile !~ /^\//) { $tarfile = $BACKUPDIR . "/$tarfile"; }

    my $check = "$dir/.tar";
    return 0 unless -e $check;

    if(-e $tarfile) {
        return 0 unless -d $dir;
        opendir(LS, $dir) or die "couldn't open $dir for listing: $!\n";
        while(local $_ = readdir(LS)) {
            #chomp;
            my $skip = 0;
            foreach my $file (@$exc) {
                $skip = 1, last if ($_ eq $file);
            }
            next if $skip;
            my $file   = $dir . $_;
            my $append = "tar -rf $tarfile $file 2>> $ERRORFILE";
            system($append);
            print "\t$file\n";
        }
        print "$tarfile appended to!\n";
    } else {
        my $create = "tar -cf $tarfile $check 2>> $ERRORFILE";
        print "$tarfile created!\n";
        system($create);
        &backup($tarfile,$dir,$exc);
    }
}

sub compress {
    my $tarfile = shift;
    if($tarfile !~ /^\//) { $tarfile = $BACKUPDIR . "/$tarfile"; }
    my $gzipit = 'gzip -f ' . $tarfile . '.gz ' . $tarfile . " 2>> $ERRORFILE";
    print "GZipping $tarfile .... ";
    system($gzipit);
    print "done!\n";
}

I know it's nothing great, but compared to the old one, it's definitely a step up.

Feel free to go "WTF ARE YOU DOING THAT FOR????? DO THIS:". I always welcome criticism; it's the only way to get better.
Thanks to everyone who's helped me in the past.

UPDATED
I took most of your suggestions; I haven't had time to download/install any modules to make this look nicer.
thanks for your input ;)
-brad..

(Ovid) Re: Evolution of a Perl Programmer (new backup script)
by Ovid (Cardinal) on Dec 21, 2000 at 00:11 UTC
    Yes, this code is nicer than your first attempt, but there is still some room for improvement. Here are just a couple of things that I noticed (this is just glancing at the code and not really going into it too deeply).
    system("rm $ERRORFILE") if -e $ERRORFILE;
    Why not just use unlink $ERRORFILE? Shelling out with system() has a fair amount of overhead, and I'm not clear why you would use it instead of a built-in function.

    Also, you really want to use File::Find for traversing through directories.
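    For what it's worth, here is a minimal sketch of what the directory walk might look like with File::Find. The directory, the exclude names, and the print statement are illustrative placeholders, not the posted script's logic:

    #!/usr/bin/perl -w
    use strict;
    use File::Find;

    # Hypothetical example: walk a directory tree and visit every file
    # that is not excluded, instead of a hand-rolled readdir loop.
    my $dir      = '/home/brad';                  # directory to back up (assumed)
    my %excludes = map { $_ => 1 } qw(tmp cache); # names to skip (assumed)

    find(sub {
        # $_ is the current basename; $File::Find::name is the full path.
        if (-d $_ && $excludes{$_}) {
            $File::Find::prune = 1;   # don't descend into excluded directories
            return;
        }
        return unless -f $_;
        print "$File::Find::name\n";  # here you might append to a tar instead
    }, $dir);

    The nice part is that subdirectories come for free, and pruning excluded directories keeps you from ever looking inside them.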

    Cheers,
    Ovid

    Join the Perlmonks Setiathome Group or just click on the link and check out our stats.

      Another good reason to use unlink() is to avoid the chance that somebody will subvert your script by slipping an executable named rm into your path.

      You could, of course, mitigate this by writing

      system("/bin/rm", $ERRORFILE) if -e $ERRORFILE;
      
      Notice also that I reworked your use of system() to pass a list of two arguments rather than a single string. This bypasses the shell, and so avoids the otherwise unavoidable, albeit small, overhead of having system() check the command for shell metacharacters.

      Better, I think, to write

      unlink($ERRORFILE) or die "$ERRORFILE: $!"
          if -e $ERRORFILE;
      
Re: Evolution of a Perl Programmer (new backup script)
by KM (Priest) on Dec 21, 2000 at 00:51 UTC
    system("rm $ERRORFILE") if -e $ERRORFILE;

    Why not unlink?

    Why not File::Find?

    Why no -w?

    Why don't you check to see if the dir is already mounted before mounting?

    Why don't you use Archive::Tar/Compress::Zlib/etc. for compressing?
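    A rough, untested sketch of those last two points, assuming Linux's /proc/mounts and the Archive::Tar module; the is_mounted helper and the file list are made up for illustration, and the variable names are borrowed from your script:

    #!/usr/bin/perl -w
    use strict;
    use Archive::Tar;

    my $PARTITION  = "/dev/hdd1";
    my $MOUNTPOINT = "/mnt/tmp";

    # Only mount the backup partition if it isn't mounted already
    # (checking Linux's /proc/mounts; parsing `mount` output would also work).
    sub is_mounted {
        my $point = shift;
        open(MOUNTS, "</proc/mounts") or die "couldn't read /proc/mounts: $!\n";
        while (<MOUNTS>) {
            my (undef, $mounted_on) = split;
            if ($mounted_on eq $point) { close MOUNTS; return 1; }
        }
        close MOUNTS;
        return 0;
    }

    system("mount $PARTITION $MOUNTPOINT") unless is_mounted($MOUNTPOINT);

    # Build a gzipped tar with Archive::Tar instead of shelling out to tar/gzip.
    my $tar = Archive::Tar->new;
    $tar->add_files(glob("/etc/*.conf"));     # hypothetical file list
    $tar->write("$MOUNTPOINT/etc.tar.gz", 1); # true second arg = gzip the output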

    And, personally, when I have something on a cron and use any type of config file, I like to make sure that no one is editing it when the script starts to run. I usually use a shell script, called by cron, that is something like:

    #!/bin/sh
    r=`ps -eaf | grep <name of script> | grep -v grep | grep -v vi`
    if [ $? -eq 1 ]
    then
        /run/the/script
    else
        sleep 10
        /re/run/this/script
    fi

    This way, if the script is already running, is being edited, or the conf file is in use (assuming foo.pl has foo.conf, and you have <name of script> as 'foo'), it will wait. YMMV.

    But, much better than your original. It is always exciting to see yourself (and others) progress. Good show!

    Cheers,
    KM

Re: Evolution of a Perl Programmer (new backup script)
by mothra (Hermit) on Dec 21, 2000 at 08:45 UTC
    I'll avoid repeating what's already been said. As a point of note though:
    foreach my $file (@$exc) {
        if($_ eq $file) {
            $skip = 1;
        }
    }

    is broken, since it'll keep looping through the exclusion list even after it has already found that this file or directory is excluded from the backup. Realistically, in a small, simple backup script like this, run on your own computer most of the time, it probably wouldn't matter. But it IS generally a good idea to leave a loop when you no longer need it to execute. :)

    foreach my $file (@$exc) {
        if($_ eq $file) {
            $skip = 1;
        }
        last if $skip;
    }
      Don't test an exit flag, just set the flag and exit.
      foreach my $file (@$exc) {
          $skip = 1, last if ($_ eq $file);
      }

      --
      $you = new YOU;
      honk() if $you->love(perl)

        Ah, but the only reason for the flag is to immediately skip to the next iteration of the while loop, which can be done without a flag at all:
        FILE: while (defined(local $_ = readdir(LS))) {
            foreach my $file (@$exc) {
                next FILE if ($_ eq $file);
            }
            my $file   = $dir . $_;
            my $append = "tar -rf $tarfile $file 2>> $ERRORFILE";
            system($append);
            print "\t$file\n";
        }
        Although I would use a hash for the excludes, instead of looping over an array for each file in the directory, especially since the most common case of a file not being in the exclude list would also be the slowest case.
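        A rough sketch of that hash-based lookup, keeping the variable names from the posted script (the %exclude hash is mine, added for illustration):

        # Build the exclude lookup once, before the readdir loop.
        my %exclude = map { $_ => 1 } @$exc;

        while (defined(local $_ = readdir(LS))) {
            next if $exclude{$_};   # constant-time check instead of an inner loop
            my $file   = $dir . $_;
            my $append = "tar -rf $tarfile $file 2>> $ERRORFILE";
            system($append);
            print "\t$file\n";
        }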

        I also recommend testing the readdir() assignment with defined, as I did above, so that the while loop will not stop prematurely if there's a file named '0'.

        Thank you!!!