in reply to Archive::Zip performance question

Jonathan,

Besides the rewriting issue, there are some other possible optimizations.

Rule 1 of optimization: Don't do it. If this is a nightly backup job, who cares about run time?

Something seems to be wrong with your file organization. A single directory with at least 3500 files, and you still need to pick the text files out of it. What kind of mess is this?

Don't reinvent the wheel. Assuming a clean directory structure: system("$commandline_packer $options $source_dir $zipfile"). Or leave the file globbing to the command interpreter entirely: system("$commandline_packer $options *.txt $zipfile"). Estimated speed increase: a factor of 3 to 10. On my system, 7-Zip is the open-source replacement for WinZip. Shelling out to an external packer is how this was done for ages.
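
A minimal sketch of that shell-out, assuming 7-Zip's command-line tool 7z is on the PATH; the archive name and file pattern are placeholders:

    # Let 7z expand the wildcard itself; one external process does
    # all the reading and compressing.
    my $zipfile = 'backup.zip';
    system("7z a -tzip $zipfile *.txt") == 0
        or die "7z failed with exit status $?";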

Assuming many short text files, most of the time will be spent on disk seeks, and during that time the CPU is idle. With two threads, the previous file can be compressed while the next one is being sought and read.
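
A minimal sketch of that overlap, assuming a threads-enabled Perl; it compresses each file individually with IO::Compress::Gzip just to keep the example self-contained, so the archive handling is a stand-in rather than your Archive::Zip code:

    use strict;
    use warnings;
    use threads;
    use Thread::Queue;
    use IO::Compress::Gzip qw(gzip $GzipError);

    my $q = Thread::Queue->new;

    # Compressor thread: burns CPU on the previous file while the
    # main thread is blocked on the next seek/read.
    my $worker = threads->create(sub {
        while (defined(my $job = $q->dequeue)) {
            my ($name, $data) = @$job;
            gzip(\$data => "$name.gz")
                or warn "gzip of $name failed: $GzipError";
        }
    });

    # Reader: slurps files one after another and hands them over.
    for my $file (glob '*.txt') {
        open my $fh, '<:raw', $file or next;
        local $/;                       # slurp mode
        $q->enqueue([$file, scalar <$fh>]);
    }
    $q->enqueue(undef);                 # signal: no more work
    $worker->join;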

With numbers as large as yours, it might even pay off to help the OS reduce disk seeks. The idea: build your file list, stat() each entry (which needs to be done anyway, at least implicitly) and sort by inode number. Only then start reading. Assuming a more or less continuous mapping between inode and physical disk location, you have reduced the number of disk seeks to a minimum. Sometimes you can even hear and see this optimization. I'm not sure whether stat() reports inode numbers on Windows; it did not on my outdated ActiveState Perl version.
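
A minimal sketch of the inode sort; (stat)[1] is the inode field, which may simply be 0 on filesystems that don't provide one:

    use strict;
    use warnings;

    my @files = glob '*.txt';

    # Schwartzian transform: stat once, sort by inode, keep the names.
    my @by_inode = map  { $_->[1] }
                   sort { $a->[0] <=> $b->[0] }
                   map  { [ (stat $_)[1] || 0, $_ ] } @files;

    for my $file (@by_inode) {
        # read and compress $file here, now roughly in disk order
    }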

Archive::Zip is much more useful if the files have been munged by Perl before compression. TMTOWTDI.
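
For completeness, a minimal sketch of that use case; the line-ending normalization is just an arbitrary example of munging:

    use strict;
    use warnings;
    use Archive::Zip qw(:ERROR_CODES :CONSTANTS);

    my $zip = Archive::Zip->new;
    for my $file (glob '*.txt') {
        open my $fh, '<', $file or next;
        local $/;
        my $text = <$fh>;
        $text =~ s/\r\n/\n/g;           # munge in Perl before zipping
        # addString needs no temp file; string members are stored
        # uncompressed by default, so request deflation explicitly.
        my $member = $zip->addString($text, $file);
        $member->desiredCompressionMethod(COMPRESSION_DEFLATED);
    }
    $zip->writeToFileNamed('backup.zip') == AZ_OK
        or die "could not write zip";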