in reply to Industrial strength archiving

Unfortunately, 'industrial strength archiving' means non-Perl (well, it could be Perl, if someone sat down and wrote the necessary libs and utils...)

Why? Because heavyweight archiving means bypassing (at least partly) the filesystem structure — i.e. not "open /, list all dirs, open every dir, list all files, archive them... etc.", but: walk the byte stream linearly and archive whatever is marked for archiving.

At this point in time only tools from the dump family can do tricks like that. I mostly use XFS, and xfsdump fits the bill; it beats all Perl-lib-accessible methods like Archive::Tar by so wide a margin that they're not really in the same competition. (And on a heavily loaded system this means: xfsdump finishes its dump, while tools like tar produce only unusable garbage.)
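As an illustration (the paths, labels, and destination here are made up, and exact flags can vary by version — check xfsdump(8) for your setup), a full level-0 dump of an XFS filesystem streamed straight to stdout looks roughly like this:

```shell
# Level-0 (full) dump of an XFS filesystem, written to stdout ("-")
# so it can be redirected or piped onward. xfsdump walks the
# filesystem's own metadata rather than doing a recursive directory
# traversal, which is where the speed comes from.
# -l 0  : dump level 0 (full backup)
# -L/-M : session and media labels (avoids interactive prompts)
xfsdump -l 0 -L nightly -M disk0 - /data > /backup/data.level0.dump
```

Restores go through the companion xfsrestore tool reading the same stream.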

Now, for compression... lzop is a great tool; I find it extremely useful (it offers very fast compression, with ratios hovering around what gzip -1 achieves). And it can compress streams, so you shouldn't have any trouble piping to it and from it from Perl.
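For example (the paths and filenames are made up), lzop reads stdin and writes stdout, so it slots into a pipeline exactly like gzip does — which is also what makes it easy to drive from Perl with a piped open:

```shell
# Compress a tar stream on the fly; lzop's default level is already
# in gzip -1 territory for ratio, at a fraction of the CPU cost.
tar cf - /var/log | lzop > logs.tar.lzo

# And decompress back into a stream (-d decompress, -c to stdout):
lzop -dc logs.tar.lzo | tar xf -
```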

Another toy I found very neat and useful in the archiving business is rzip. It's definitely not an industrial-strength type of solution, because it's very young (and, for your use, it cannot and will not support compressing streams...), but it easily outperforms bzip2 -9 by a healthy 10-30%.
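Because rzip needs a seekable file rather than a stream, the usual pattern is to let the archiver finish first and then compress the result in place. A sketch (filenames made up; the level and keep flags may differ between rzip versions, so check its man page):

```shell
# rzip can't read stdin: write the archive to disk first,
# then compress the finished file.
tar cf - /data > backup.tar

# -9: maximum compression effort; -k: keep the original file
# instead of deleting it after compression.
rzip -9 -k backup.tar    # produces backup.tar.rz alongside the original
```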

That is a very significant achievement on its own; what's surprising is that, working on typical backup archives (multi-gigabyte files), it sometimes runs several times faster than bzip2 while still beating it on compression ratio.

Of course, you need a rather healthy machine to run it, because its working set hovers around 0.5 GB...