in reply to backup script runs out of memory

Wouldn't it be easier just to use "tar"? (The GNU version is the accepted standard, and it is available for all common platforms.)
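For instance, a single tar command along these lines would do the whole job in one step (the archive name, exclude file, and directory here are only placeholders):

    tar czf /path/to/backups/backup.tar.gz -X ignore.txt /dir/to/backup

With GNU tar, -X names a file containing patterns to exclude from the archive.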

If a Perl wrapper makes it more comfortable for you, here's a version of your script that is functionally equivalent(*) to the OP's, but is a lot simpler (and will run a lot faster, without using very much memory at all, since the heavy lifting is handed off to tar rather than done inside Perl):

#!/usr/bin/perl -w

=head1 NAME

backup.pl - Yet Another Script for making backups

=head1 SYNOPSIS

  backup.pl --bakdir=s --backup=s [--ignorefile=s]

  Options:
    --bakdir     - where to look for and store backup files
    --backup     - what directory to backup
    --ignorefile - file which lists what subdirs not to backup

=cut

use strict;
use warnings;
use English;
use Getopt::Long;
use Pod::Usage;
use File::Path;

my $bakdir     = '';
my $backup     = '';
my $ignorefile = '';

GetOptions(
    'bakdir=s'     => \$bakdir,
    'backup=s'     => \$backup,
    'ignorefile=s' => \$ignorefile,
);

$bakdir ||= ".";
$backup ||= ".";

if ( $bakdir eq $backup ) {
    warn "We should not create a backup of $backup in $bakdir\n";
    pod2usage(1);
    exit;
}

eval { mkpath($bakdir) };
if ($@) {
    warn "Unable to find or create directory $bakdir\n$@\n";
    pod2usage(1);
    exit;
}

# create a tar.gz archive with a unique filename
my @t = reverse( (localtime)[0..5] );
$t[0] += 1900;
$t[1]++;
my $t = sprintf( "%4u-%02u-%02u-%02u-%02u-%02u", @t );
my $newbackup = "$bakdir/$t.tar.gz";

my @cmd = qw/tar cz/;
push @cmd, '-X', $ignorefile
    if ( length($ignorefile) and -f $ignorefile and -r _ );
push @cmd, '-f', $newbackup, $backup;
exec @cmd;
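An invocation of the wrapper might look something like this (the paths are only placeholders):

    ./backup.pl --bakdir=/var/backups --backup=/home/me --ignorefile=/home/me/backup-ignore.txt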

(* footnote: it's not exactly equivalent -- I felt compelled to add a couple checks on the ARGV option values; more could still be done in this regard...)

For some reason, pod2usage did not work for me as expected, but with proper command-line args, this version does accomplish everything that the OP set out to do.

(updated code to add one more check on the "ignorefile" arg, and to remove the "debug" option, since it's not really needed here)

Re^2: backup script runs out of memory
by gri6507 (Deacon) on Aug 09, 2006 at 12:51 UTC
    Thanks. I should have looked at the man page for tar to see whether it could use the ignorefile directly.

    I do, however, have a question about this line:
        if ( length( $ignorefile ) and -f $ignorefile and -r _ );
    I understand that the first two checks see whether the argument is specified and whether that file exists. But what is the third check? I thought that -r should test whether the file is readable. But what, then, is the _?

      But what then is the _?

      Look up "perldoc -f -X"; the underscore means "get this information from the same stat data structure that we used the last time", which saves a few ops. "-r _" assures that the file we just checked with "-f" is readable by the current user.
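      A minimal sketch of the idiom (the filename is only an illustration):

          # -f stats the file; '-r _' then reuses that cached stat data
          # instead of calling stat() a second time.
          my $file = 'exclude.txt';
          if ( -f $file and -r _ ) {
              print "$file is a plain file and readable\n";
          }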