in reply to -s takes too long on 15,000 files

It's the -s bit that slows the program down: the stat takes about 5 minutes to execute on 10,000 files, while 'ls' throws a listing up in less than 10 seconds, and Windows 'dir' performs similarly.

There is something weird about all of this. I did a few tests on a 700MHz W2k box and averaged around 3 seconds to stat 20,000 files in a local directory. Doing the readdir into an array had a negligible effect on performance. I also compared against a backticked dir command (I didn't parse it), which took 1 second.
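For reference, here is roughly the kind of timing test I mean: readdir the whole directory into an array, then -s each entry, timed with Time::HiRes. The directory argument is just a placeholder; point it at whatever directory you're testing.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Time::HiRes qw(gettimeofday tv_interval);

# Placeholder: pass the directory to test, defaults to cwd.
my $dir = shift @ARGV || '.';

opendir my $dh, $dir or die "opendir $dir: $!";
my @files = readdir $dh;        # slurp the whole listing into an array first
closedir $dh;

my $t0    = [gettimeofday];
my $total = 0;
for my $f (@files) {
    my $size = -s "$dir/$f";    # the stat call under test
    $total += $size if defined $size;
}
printf "stat'd %d entries (%d bytes) in %.2fs\n",
    scalar @files, $total, tv_interval($t0);
```

If that loop takes minutes rather than seconds on a local disk, the problem is in the filesystem or the environment, not in Perl's stat.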

My only guess as to why you are seeing that performance is that you are going over a network, but you already said the directories were local, so there must be something else going on.

Incidentally, I once wrote an MD5 dupe checker, and it usually ran over 40k small files spread through a flat directory tree in a matter of minutes, including the unlinks.
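The core of that kind of dupe checker is only a few lines: hash each file's contents with Digest::MD5, remember the first path seen for each digest, and flag any later path with the same digest. A minimal sketch (printing candidates rather than unlinking them, and with a placeholder root directory):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use Digest::MD5;

# Placeholder: pass the root of the tree to scan, defaults to cwd.
my $root = shift @ARGV || '.';

my %seen;     # digest => first path seen with that content
my @dupes;    # later paths whose content matched an earlier file

find(sub {
    return unless -f $_;
    open my $fh, '<', $_ or return;
    binmode $fh;
    my $digest = Digest::MD5->new->addfile($fh)->hexdigest;
    close $fh;
    if (exists $seen{$digest}) {
        push @dupes, $File::Find::name;    # candidate for unlink
    }
    else {
        $seen{$digest} = $File::Find::name;
    }
}, $root);

print "$_\n" for @dupes;
```

Since the digest is computed incrementally with addfile, even large files don't need to be slurped into memory, and the per-file cost is dominated by I/O rather than the hashing itself.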

Yves
--
You are not ready to use symrefs unless you already know why they are bad. -- tadmc (CLPM)