zzspectrez has asked for the wisdom of the Perl Monks concerning the following question:

Hello, fellow perl monks. I have a question for the wise ones! I have a small mail server I use to supply email for family, friends, and myself. Nothing serious, so I have never really had any true backups. The server itself I can set up quickly, since it is a bare-bones setup just to do mail and act as a firewall. The only things important on it are the email messages and some web pages.

I want to write a perl script to automate the process. I have written a preliminary script to backup selected mail accounts to a zip drive.

Questions I have:

  1. Can I run this from cron without opening a security hole on my server?
  2. I was unsure how to make sure I'm not backing up a mailbox while the mail program (Exim) is updating the spool. I checked Exim's docs and they don't mention anything. Do I just need to use flock?
  3. Any suggestions!
Here is the code:

#!/usr/bin/perl -w
# simple script to backup specified mail accounts
# to a zip disk.
use strict;

my $zip_drive = "/dev/hdc4";
my $mnt_dir   = "/mnt";
my $tmp_dir   = "/tmp/";
my $bkup_root = "/var/spool/mail/";
my @bkups     = ( 'kirk', 'spock', 'mrmister', 'you' );
my $todays_date = join '-',
    (('jan','feb','mar','apr','may','jun',
      'jul','aug','sep','oct','nov','dec')[(localtime)[4]]),
    ((localtime)[3]),
    ((localtime)[5] + 1900);

$ENV{PATH} = "/bin:/usr/bin";

print "Mounting zip drive for backup...\n";
die "Couldn't mount zip drive: $?\n"
    if system("mount", "-t", "vfat", $zip_drive, $mnt_dir);

my $old_dir = `pwd`;
chomp $old_dir;

chdir $bkup_root
    or die "Can't change working directory to [$bkup_root]: $!\n";

foreach my $bkup (@bkups) {
    my $arch = "$tmp_dir$bkup-$todays_date.tar";
    if (system("tar", "-cvvf", $arch, $bkup)) {
        warn "Could not tar file [$bkup]: $!\n";
        next;
    }
    else {
        print "File has been tarred.\n";
        if (system("gzip", "-9", $arch)) {
            warn "Could not gzip file [$arch]: $!\n";
        }
        else {
            print "File has been gzipped.\n";
            if (system("mv", "$arch.gz", "$mnt_dir/mail/")) {
                warn "Could not move file from temp directory",
                     " to zip drive: $!\n";
            }
            else {
                print "File has been moved to zip drive for",
                      " backup\n";
            }
        }
    }
}

chdir $old_dir
    or die "Can't restore working directory to [$old_dir].\n";

print "Unmounting zip drive...\n";
die "Couldn't unmount zip drive: $?\n"
    if system("umount", $mnt_dir);

print "\nZip disk can be removed.\n";

Thanks for any suggestions!!
zzspectrez

Replies are listed 'Best First'.
Re: Using perl to automate mail backup
by AgentM (Curate) on Nov 12, 2000 at 09:40 UTC
    A quick root crontab is the best solution for making backups. I'm used to tarred files on tape drives; on random-access units such as the zip drive, you need to either adopt a naming convention or simply cat onto the tar (which is insanely easy with tar) to end up with one file, which is most efficient. For a timed daemon, this program is unusually verbose; that's not necessary except for debugging. You might also want to confirm at least the Zip diskette's title before you add files to it, since at 12:00 am you might be writing your English essay when the backup kicks in and clobbers it! Because of how tar magic works, you will be able to extract individual sections with no problem, especially if you switch from system() to Archive::Tar, which will simplify the interface. The module actually has add_files() and add_data() functions! Also, mounting and unmounting your diskette is not necessary if the drive is used only for backups.
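    A minimal sketch of that Archive::Tar approach; the mailbox names and output path here are illustrative placeholders, and missing files are simply skipped:

```perl
#!/usr/bin/perl -w
# Sketch of the Archive::Tar approach mentioned above. Mailbox names
# and paths are illustrative placeholders; missing files are skipped.
use strict;
use Archive::Tar;

my $tar = Archive::Tar->new();
for my $mbox ('kirk', 'spock') {
    my $path = "/var/spool/mail/$mbox";
    $tar->add_files($path) if -e $path;
}
# add_data() can drop in a generated file, e.g. a manifest or disk label:
$tar->add_data('MANIFEST', "backup created " . localtime() . "\n");
$tar->write('/tmp/mail-backup.tar.gz', 1);   # a true second arg gzips the archive
```

    Keeping everything in one process also means the exit status and error text stay in Perl, instead of coming back through system().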

    I also noticed that you are not handling many error cases. Even where you do check for errors, you will never be notified of them if this runs as a cron job; if the backup is unsuccessful, I would hope that it would at least mail you or add a syslog() entry. Also, you will never know when the drive is full. If it is, the script should notify you immediately so that the backup can still be completed.

    You should quickly check to see if the diskette is the correct one for backup. Perhaps a small file with the title "Email Backup Disk 1" would do the trick.

    In short, daemons should definitely be equipped with an emergency mechanism for failure, such as syslog or wall for more important things.
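    A hedged sketch of such a failure hook using Sys::Syslog; the ident, facility, and account name are arbitrary choices, and the system('true') call stands in for the real tar invocation:

```perl
#!/usr/bin/perl -w
# Sketch of a failure hook for the cron version: log errors with
# Sys::Syslog instead of printing them. Ident, facility, and the
# account name are placeholders; system('true') stands in for tar.
use strict;
use Sys::Syslog qw(openlog syslog closelog);

openlog('mailbackup', 'pid', 'user');
my $bkup = 'kirk';                      # placeholder account name
if (system('true') != 0) {              # stand-in for the real tar call
    syslog('err', 'backup of %s failed with status %d', $bkup, $? >> 8);
}
closelog();
```

    The same branch is a natural place to pipe a message to sendmail if syslog alone isn't visible enough.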

    flock: flock implements only advisory locking, not mandatory locking. If your mail spooler doesn't use flock, then you can't usefully use it either. If your system supports mandatory file locking, then by all means use it, unless it interferes with the spooler... but you're just reading files, whether complete or not. A ten-second suspension of the MTA is probably out of the question :-) and the next backup would catch the next round of data. It's apparent to me that this data is not incredibly critical, so your workarounds should be OK.
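    For reference, the advisory-lock version is only a few lines; this is a sketch with a placeholder mailbox path, and it only helps if Exim also flocks the spool file (check its transport configuration):

```perl
#!/usr/bin/perl -w
# Advisory-locking sketch: flock only helps if the MTA also flocks the
# spool file (Exim can be configured to; check its transport docs).
# The mailbox path is a placeholder, and is skipped if it doesn't exist.
use strict;
use Fcntl qw(:flock);

my $spool = '/var/spool/mail/kirk';
if (open my $fh, '<', $spool) {
    flock($fh, LOCK_SH) or die "Can't get shared lock on $spool: $!\n";
    # ... tar/copy the file while the shared lock is held ...
    flock($fh, LOCK_UN);
    close $fh;
}
else {
    warn "Can't open $spool: $!\n";
}
```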

    AgentM Systems nor Nasca Enterprises nor Bone::Easy nor Macperl is responsible for the comments made by AgentM. Remember, you can build any logical system with NOR.

      I'm not sure I understand what you mean by a naming convention or simply catting to the tar. The tar files are created using a naming convention: username-month-date-year.tar. I know it would be better to just tar /var/spool/mail, but I have a few mailboxes that are quite large (it's an IMAP server) and I don't want to back them up as frequently. I'm going to modify the script to back up a few additional things (a few user directories, etc.), so it will probably tar something like this (pseudocode):

      cd /var/spool; tar -cvf /tmp/mail-bkup.tar $name1 $name2 ...
      cd /home;      tar -cvf /tmp/user-bkup.tar $user1 $user2 ...
      cd /tmp;       tar -cvf backup-$todays_date.tar mail-bkup.tar user-bkup.tar

      Good point about checking the zip disk; I didn't think about that. Otherwise the backup gets done, I just don't know onto which disk!

      I'm not using modules such as Archive::Tar because I am trying to better understand using perl as a glue language. I come from a Windows background and need the experience playing with the unix tools.

      Currently the error checking is laughable, I know. This is just a test script that I'm running from the command line, and I want to see what's going on. I think I'll take your suggestion and have it mail me any errors. How could I grab errors reported by tar and gzip? Can I capture stderr somehow?


      Thanks!
      zzspectrez

        If you open tar and gzip with IPC::Open3 or IPC::Run, you'll be able to capture and use all three STD* pipes. I might reconsider using the Perl modules, though: while you might be able to mail yourself the STDERR from a failed tar, it might not always be obvious what happened, and with the perl modules this is less of a problem. You should at least check $? and turn on verbose mode on tar and gzip.
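        A hedged sketch of the IPC::Open3 route; the tar arguments are placeholders, and reading both pipes completely like this is only safe for small output (use IPC::Run for big streams):

```perl
#!/usr/bin/perl -w
# Sketch: run tar under IPC::Open3 so its stderr can be captured and
# then mailed or logged. Paths are placeholders; if the mailbox doesn't
# exist, tar fails harmlessly and that failure is what gets reported.
# Reading each pipe to completion is fine for small output only.
use strict;
use IPC::Open3;
use Symbol qw(gensym);

my $err = gensym;   # open3 needs a real handle here, or stderr merges into stdout
my $pid = open3(my $in, my $out, $err,
                'tar', '-cvf', '/tmp/backup.tar', '/var/spool/mail/kirk');
close $in;
my @stdout = <$out>;
my @stderr = <$err>;
waitpid($pid, 0);
my $status = $? >> 8;
warn "tar exited with status $status:\n@stderr" if $status;
```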
Re: Using perl to automate mail backup
by Fastolfe (Vicar) on Nov 12, 2000 at 09:48 UTC
    If you're worried about security, the first thing I'd do is use taint checking (-T). See perlsec for the usual caveats. It looks like you're doing a good job of keeping a watch out for potentially unsafe data though.

      I originally did use -T, but I wanted to be able to restore the working directory to its original value when the program exits. The only way I know to get it is my $old_dir = `pwd`, and the output of the pwd program causes $old_dir to be tainted.

      From what I can gather, the only proper way to untaint data is to use a regular expression, something along the lines of: my ($untainted) = $old_dir =~ /^(.*)/. Then I could use chdir $untainted without getting a warning about an insecure dependency in chdir. If I understand correctly, this is how you convince perl that you have filtered the input of any harmful data. Will my code above cause some insecurity because it blindly untaints the output of pwd?
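      For comparison, a sketch that untaints against a restrictive character class instead of the catch-all /^(.*)/; the class chosen here is an assumption about what a "safe" path looks like, and under -T the backticks also require a sanitized PATH:

```perl
#!/usr/bin/perl -w
# Untainting sketch: under -T, accept only an absolute path made of
# "safe" characters rather than blindly capturing everything with
# /^(.*)/. The character class is an assumption; widen it if your
# directory names need more.
use strict;

$ENV{PATH} = "/bin:/usr/bin";   # backticks need a clean PATH under -T
my $old_dir = `pwd`;
chomp $old_dir;

if ($old_dir =~ m{^(/[\w./-]*)$}) {
    $old_dir = $1;              # $1 from a successful match is untainted
}
else {
    die "Suspicious working directory: [$old_dir]\n";
}
chdir $old_dir or die "Can't chdir to [$old_dir]: $!\n";
```

      The difference is that /^(.*)/ declares everything safe, while the anchored class makes the script die on anything unexpected instead of passing it to chdir.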

      Thanks for the link to perlsec. I have looked it over a few times, but it is one document that I need to read more thoroughly. I am not a security expert, that is for sure!

      Thanks!
      zzspectrez

        I'm not sure what you mean by resetting the current working directory. When your script exits, its current working directory disappears with it, just like its environment.
        (fastolfe) eddie:~$ perl -e 'chdir("tmp"); system("pwd");'
        /home/fastolfe/tmp
        (fastolfe) eddie:~$ pwd
        /home/fastolfe
        There is also the Cwd module, which will fetch the current working directory (via getcwd or cwd). I don't know if/how these values are tainted though.
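        The Cwd version of the save-and-restore dance is a one-line change; a sketch (whether getcwd's return value is tainted under -T is worth testing on your build):

```perl
#!/usr/bin/perl -w
# Sketch: Cwd::getcwd() replaces the `pwd` backticks without spawning
# an external program. (Whether its return value is tainted under -T
# varies; if it is, untaint it with a regex as discussed above.)
use strict;
use Cwd qw(getcwd);

my $old_dir = getcwd();
chdir '/tmp' or die "Can't chdir to /tmp: $!\n";
# ... do the backup work ...
chdir $old_dir or die "Can't restore working directory [$old_dir]: $!\n";
```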