in reply to Perl backup script

my (@file_change,@backup);
my ($time2,$change,$file,$files);

You should declare variables as close as possible to where they are first used instead of all at the beginning.
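For example, the loop variable and the stat results are only needed inside the loop, so they can be declared there; a sketch of the same logic (only @backup needs to outlive the loop):

    my @backup;
    for my $file (</tmp/*>) {
        my @file_change = stat($file);             # stat returns a 13-element list
        my $change      = time - $file_change[9];  # index 9 is mtime
        push @backup, $file if $change < 604800;
    }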



open my $report, ">", "report.txt";

You should always verify that the file opened correctly before trying to use a possibly invalid filehandle.
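A common idiom is to append "or die" with $! so a failed open is reported immediately, for example:

    open my $report, ">", "report.txt"
        or die "Cannot open report.txt: $!";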



for $file (</tmp/*>) {

The files in /tmp are temporary, why would you want to save them?



@file_change = stat($file);
$time2 = $file_change[9];
# (day = 86400, week = 604800)
$change = (time - $time2);
push @backup, $file if ($change < 604800);

If you just want time periods based on days, then you could use the -M file test operator.
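-M returns the file's age in days, measured from when the script started, so the one-week test collapses to a single expression, for example:

    push @backup, $file if -M $file < 7;   # modified within the last 7 days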



# function to create tarball and backup changed files
sub backup {
    $files = "@backup";
    system("tar -czvf backup.tgz $files");
}

# run backup function
backup();

Why does this need to be a subroutine? You should verify that system succeeded.
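system returns the child's exit status, so the call can be checked directly; a sketch (the exit-status handling follows perldoc -f system):

    system("tar", "-czvf", "backup.tgz", @backup) == 0
        or die "tar failed, exit status ", $? >> 8, "\n";

Passing tar its arguments as a list also bypasses the shell, so filenames containing spaces or other shell metacharacters are handled correctly.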

Re^2: Perl backup script
by jpl (Monk) on Apr 12, 2011 at 09:55 UTC

    The most important thing you can do about backups is to think hard about the "threat model" you are worried about. The #1 reason I want backups is accidental removal or destruction of a file or files I am actively modifying. The #2 reason is loss of an entire disk. The best defense against #1 losses is a good version control system. I recommend git, but there are many others. What version control systems do well, and tar/rsync do poorly, is keeping a comprehensive history of the changes you make, and calling attention to changes you might not have realized you made. And you decide the appropriate moments at which to commit changes. tar/rsync may catch a file in the process of changing, at which point the backup is worthless.

    With a good version control system protecting against #1 errors, tar or rsync can focus on #2 errors. I back up entire file systems, and I back them up to different drives on different days. The first time you rsync an entire file system, it takes a long time. Thereafter, it is remarkably good at detecting what has changed, and updating those things quickly. Losing an entire building (fire, flood, earthquake, etc.) can destroy all your backups. Here, again, I find git to be useful. I can clone a work project on my home machine, and do occasional "pulls" to keep an offsite copy of what is most important.