“…a few hundred PDF's…”
One more thing occurred to me: PDFs hardly compress at all. I just tried it out: a 27k PDF becomes a 25k PDF, for example, which isn't really worth the effort. And since there are so many of them, performance probably matters. In my opinion it would be more natural to use tar, or rather Archive::Tar. I haven't measured it, but it probably performs much better.

Just as an aside and a reminder: there are those age-old comparisons of the performance of cp, rsync and tar. As far as I remember, tar has always beaten zip for operations on many files. I know there is no compression involved there, but take it as a general statement about tar's good performance.
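A minimal sketch of what I mean, assuming the PDFs sit in a flat directory (the directory and archive names here are just placeholders, adjust to taste):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Archive::Tar;

    # Hypothetical directory holding the PDFs.
    my $dir = 'pdfs';

    # Collect all PDF files from that directory.
    my @pdfs = glob("$dir/*.pdf");
    die "No PDFs found in $dir\n" unless @pdfs;

    # Bundle everything into one uncompressed tar archive --
    # no point burning CPU on data that won't shrink anyway.
    my $tar = Archive::Tar->new;
    $tar->add_files(@pdfs) or die $tar->error;
    $tar->write('pdfs.tar') or die $tar->error;

    print "Archived ", scalar(@pdfs), " PDFs to pdfs.tar\n";

Note that write() takes an optional compression flag (COMPRESS_GZIP); leaving it off is deliberate here, for the reason given above.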
Minor update: struck out irrelevant content.