in reply to Re^2: Using Crypt::OpenPGP with large files
in thread Using Crypt::OpenPGP with large files
If the module already calls /usr/local/bin/gpg externally, then you have a different problem. Make sure you are not creating a new Crypt::OpenPGP object for each file, otherwise you will be incurring a lot of overhead. What you want to do is create one Crypt::OpenPGP object and reuse it, clearing out all the old data in the object after each use. See the perlfaq entry quoted below.
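A minimal sketch of that reuse pattern, assuming Crypt::OpenPGP is installed and using its documented new()/decrypt() named parameters; the passphrase source and file list are placeholders:

```perl
use strict;
use warnings;
use Crypt::OpenPGP;

# Build the object once, outside the loop. Compat => 'GnuPG' is the
# documented option for GnuPG-compatible defaults.
my $pgp = Crypt::OpenPGP->new( Compat => 'GnuPG' );

for my $file (@ARGV) {
    my $plaintext = $pgp->decrypt(
        Filename   => $file,
        Passphrase => $ENV{PGP_PASS},   # placeholder passphrase source
    );
    defined $plaintext or warn "$file: ", $pgp->errstr;
    # ... write $plaintext somewhere, then drop it so the loop's
    # footprint doesn't stay swollen at the largest file seen ...
    undef $plaintext;
}
```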
So now you can reuse that same invocation of /usr/local/bin/gpg, which will speed things up and reduce memory usage. You would still have one memory problem, however: the Perl object's memory usage will swell to the largest file size it encounters.

From perlfaq7, "How do I clear a package?" -- use this code, provided by Mark-Jason Dominus:

    sub scrub_package {
        no strict 'refs';
        my $pack = shift;
        die "Shouldn't delete main package"
            if $pack eq "" || $pack eq "main";
        my $stash = *{ $pack . '::' }{HASH};
        my $name;
        foreach $name (keys %$stash) {
            my $fullname = $pack . '::' . $name;
            # Get rid of everything with that name.
            undef $$fullname;
            undef @$fullname;
            undef %$fullname;
            undef &$fullname;
            undef *$fullname;
        }
    }

Or, if you're using a recent release of Perl, you can just use the Symbol::delete_package() function instead.
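Since Symbol ships with core Perl, the delete_package() route needs no extra code; a small sketch (the package name My::Scratch is made up for illustration):

```perl
use strict;
use warnings;
use Symbol qw(delete_package);

# Populate a throwaway package to stand in for accumulated state.
{
    package My::Scratch;
    our @buffer = (1 .. 1000);
}

# The stash entry exists in the parent symbol table...
print exists $My::{'Scratch::'} ? "present\n" : "gone\n";   # present

delete_package('My::Scratch');

# ...and is removed wholesale, freeing everything it held.
print exists $My::{'Scratch::'} ? "present\n" : "gone\n";   # gone
```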
So you may have to filter your files by size, to decide whether to feed them directly to the gpg binary or to the reusable module object.
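One way to sketch that dispatch; the 10 MB threshold and the function name route_file are assumptions to tune for your machine:

```perl
use strict;
use warnings;

# Assumed cutoff: files bigger than this go straight to the binary.
my $THRESHOLD = 10 * 1024 * 1024;   # 10 MB

sub route_file {
    my ($path) = @_;
    my $size = -s $path;            # undef if the file doesn't exist
    return 'gpg'    if defined $size && $size > $THRESHOLD;
    return 'module';                # small files stay in-process
}
```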
There is also the possibility of running parallel processes if you have a multicore computer. But running gpg through threads is a whole different thread on its own.
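A fork-per-file sketch of that idea (separate processes rather than perl threads, since each gpg run is its own process anyway); the worker count and the gpg command line are assumptions to adjust for your cores and keys:

```perl
use strict;
use warnings;

my @files       = @ARGV;
my $max_workers = 4;        # assumed: roughly one per core
my %kids;                   # pid => file

for my $file (@files) {
    # Throttle: wait for a slot when all workers are busy.
    if (keys %kids >= $max_workers) {
        my $done = wait();
        delete $kids{$done};
    }
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: one gpg process per file (options are placeholders).
        exec 'gpg', '--batch', '--yes', '--decrypt',
             '-o', "$file.out", $file
            or die "exec gpg failed: $!";
    }
    $kids{$pid} = $file;
}

# Reap the stragglers.
waitpid($_, 0) for keys %kids;
```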
One last bit of advice: run all your decoding on a ramdisk if the data is to be kept private, so the plaintext never touches disk. :-)
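On Linux, a tmpfs mount serves as the ramdisk; this fragment needs root, and the mount point, size, and mode are assumptions (note that tmpfs pages can still be swapped out, so disable swap or encrypt it if privacy really matters):

```shell
# Create an assumed mount point and mount a RAM-backed filesystem there.
mkdir -p /mnt/pgp-ram
mount -t tmpfs -o size=512m,mode=0700 tmpfs /mnt/pgp-ram
# Decode into /mnt/pgp-ram, then umount when done; contents vanish.
```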