Hello Magnolia25,
If you want to implement this yourself, you should look at the make_path function in the core module File::Path, and consider testing files either with one of the file test operators (such as -A for access age) or, better, with a module such as Digest::MD5 to determine which files need updating. (If you use MD5 you will need to store the known file hashes; a lightweight database such as DBD::SQLite would be useful here.)
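To make that approach concrete, here is a minimal sketch of the DIY route: walk the source tree with File::Find, hash each file with Digest::MD5, and copy only new or changed files, creating destination directories with make_path as needed. The sub name sync_tree and the in-memory %known_md5 hash are my own inventions for illustration; in a real script the hashes would be persisted between runs (e.g. via DBD::SQLite), and the paths are hypothetical.

```perl
#! perl
use strict;
use warnings;

use File::Find;
use File::Path qw(make_path);
use File::Copy qw(copy);
use File::Spec;
use Digest::MD5;

# Hash a file's contents.
sub file_md5 {
    my ($path) = @_;
    open my $fh, '<:raw', $path or die "Cannot open $path: $!";
    my $md5 = Digest::MD5->new->addfile($fh)->hexdigest;
    close $fh;
    return $md5;
}

# Copy new or changed files from $src_root to $dst_root, mirroring the
# directory structure. $known_md5 is a hashref of relative-path => MD5
# from the previous run (in a real script, loaded from a small database).
# Returns the relative paths of the files actually copied.
sub sync_tree {
    my ($src_root, $dst_root, $known_md5) = @_;
    my @copied;
    find({ no_chdir => 1, wanted => sub {
        return unless -f $_;
        my $rel = File::Spec->abs2rel($_, $src_root);
        my $md5 = file_md5($_);
        # Skip files whose content is unchanged since the last run
        return if exists $known_md5->{$rel} && $known_md5->{$rel} eq $md5;
        my $dest = File::Spec->catfile($dst_root, $rel);
        my (undef, $dir) = File::Spec->splitpath($dest);
        make_path($dir) if $dir && !-d $dir;
        copy($_, $dest) or die "Copy of $_ failed: $!";
        $known_md5->{$rel} = $md5;   # write back to the database here
        push @copied, $rel;
    } }, $src_root);
    return @copied;
}

# Example usage (hypothetical paths):
# my %known_md5;                       # load from storage in a real script
# sync_tree('tmp/home', 'home', \%known_md5);
```

Note the no_chdir option to find: without it, File::Find chdirs into each directory, and the paths seen in the wanted sub would not open as given.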
But why reinvent the wheel when a module such as Dackup is available on CPAN? As a test, I created a text file in a suitable directory structure and then ran the following script (adapted directly from the documentation):
#! perl
use strict;
use warnings;

use Dackup;

my $source = Dackup::Target::Filesystem->new(
    prefix => '.\tmp\home',
);

my $destination = Dackup::Target::Filesystem->new(
    prefix => '.\home',
);

my $dackup = Dackup->new(
    source      => $source,
    destination => $destination,
    delete      => 0,
    dry_run     => 0,
    verbose     => 1,
    throttle    => '1Mbps',
);

$dackup->backup;
I then edited the text file and re-ran the script to verify that it copied the changed file to the destination (backup) location. You may need to experiment a bit to determine whether it meets your backup criteria.
Hope that helps,
Athanasius <°(((>< contra mundum Iustus alius egestas vitae, eros Piratica,