demichi has asked for the wisdom of the Perl Monks concerning the following question:
I have written a Perl script which archives (tar + gzip) various directories on Windows 2008 R2 64bit. The server has 8 GB of memory.
This is how I get all files which need to be archived:
# Get all export and log files plus the command file so they can be tarred.
# A named nested sub is used here because "use warnings" complains otherwise.
local *push_files = sub {
    push(@all_files, $File::Find::name);
    LOG_MESSAGE("### ARCHIVING $File::Find::name\n");
};
# find() comes from the File::Find module.
find(\&push_files, ($par_COMMAND_TXT, $par_EXPORTTOOL_OUTPATH, $par_EXPORTTOOL_LOG));
This is how I do the archiving:
my $tar = Archive::Tar->new();
$tar->add_files(@all_files)
    or die LOG_MESSAGE("!!! Can't run program: $!\n");
# "no strict" is needed for older Perl versions; otherwise COMPRESS_GZIP
# triggers a "Bareword" warning.
no strict;
$tar->write("$ARCHIVE_FILE", COMPRESS_GZIP)
    or die LOG_MESSAGE("Can't run program: $!\n");
LOG_MESSAGE("### Archiving finished");
In general this works fine, but when there is too much data to archive (around 1.5 GB) I get the error message "OUT OF MEMORY" and the script stops. In the Task Manager I can see that this happens when the perl process reaches around 3.2 GB. I did a test with the Windows tool 7-Zip and that works fine; the maximum memory that tool uses is around 140 MB.
Now my question: what can I do so that I do not get this "out of memory" error?
Is there a better way to do the archiving to save memory?
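Archive::Tar builds the entire archive in memory before writing it out, which is why 1.5 GB of files can push the perl process past 3 GB, while a streaming tool such as 7-Zip only ever holds a small buffer. Since 7-Zip is already installed on the box, one option is to drive it from the script. A minimal sketch, assuming 7z.exe at the path shown and reusing the directory variables from the script above; this is untested and the cmd.exe quoting may need adjusting:

use strict;
use warnings;

# "7z a -ttar -so" streams a tar of the given paths to stdout; the second
# 7z gzips stdin into the final archive. Memory use stays near the size
# of the pipe buffer instead of the whole data set.
my $sevenzip = 'C:\Progra~1\7-Zip\7z.exe';   # short 8.3 path avoids extra quoting inside the pipeline
my $cmd = qq{$sevenzip a -ttar -so tmp.tar }                # "tmp.tar" is a required dummy name with -so
        . qq{"$par_COMMAND_TXT" "$par_EXPORTTOOL_OUTPATH" "$par_EXPORTTOOL_LOG" }
        . qq{| $sevenzip a -si -tgzip "$ARCHIVE_FILE"};
system($cmd) == 0
    or die LOG_MESSAGE("!!! 7z pipeline failed (exit code $?)\n");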
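To stay within Perl, Archive::Tar::Wrapper from CPAN hands the actual packing to an external tar binary (on Windows that means installing one, e.g. GnuWin32 tar), so the file contents never sit in Perl's memory. A rough sketch based on the module's documented interface, reusing @all_files and $ARCHIVE_FILE from the script above; treat the details as assumptions to verify against the module's docs:

use strict;
use warnings;
use Archive::Tar::Wrapper;   # requires a tar executable on the PATH

my $arch = Archive::Tar::Wrapper->new();
for my $file (@all_files) {
    # first argument is the path inside the archive, second the file on disk
    $arch->add($file, $file);
}
# a true second argument requests gzip compression
$arch->write($ARCHIVE_FILE, 1);
LOG_MESSAGE("### Archiving finished");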
Replies are listed 'Best First'.
Re: Archive::tar - out of memory - 3,2GB
by Corion (Patriarch) on Jun 13, 2014 at 09:06 UTC
by demichi (Beadle) on Jun 13, 2014 at 09:23 UTC
by Corion (Patriarch) on Jun 13, 2014 at 09:30 UTC
by demichi (Beadle) on Jun 13, 2014 at 12:14 UTC
by Corion (Patriarch) on Jun 13, 2014 at 13:00 UTC
Re: Archive::tar - out of memory - 3,2GB
by kevbot (Vicar) on Jun 14, 2014 at 04:51 UTC