in reply to -s takes too long on 15,000 files

A couple of things to try. First, it's not clear from your snippet whether you're concatenating the directory onto each filename or not. With a large directory, you're better off doing a chdir there, then opening and reading from ".", because then each stat doesn't have to traverse the full path. (It's probably cached, depending on your kernel, but this way it won't even matter.) As I said, maybe you're already doing this, but it's not clear from the code.
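
To make the chdir point concrete, here's a minimal sketch; the directory is the same hypothetical /some/where used below, and $file just stands in for one of your filenames:

    my $file = 'example.dat';    # stand-in for one of your 15,000 files

    # With a full path, every -s has to resolve the whole directory path again:
    my $size_full = -s "/some/where/$file";

    # After a chdir, a relative -s only has to look up the bare filename:
    chdir '/some/where' or die "Can't chdir /some/where: $!\n";
    my $size_rel  = -s $file;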

Another thing is to separate reading the directory from stat'ing the files. That is, slurp the directory contents in one go, then walk through the resulting array. This will eat up a lot more memory, but it may be a worthwhile tradeoff (you'll have to make that call). Combining the two, your loop would look something like this:

    my $dir = '/some/where/';
    chdir $dir or die "Can't chdir $dir\n";

    opendir D, '.' or die "Can't opendir '.'\n";
    my @files = readdir(D);
    closedir D;

    my %filesize;
    foreach my $f (@files) {
        $filesize{$f} = -s $f;
    }
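
If you prefer lexical dirhandles, the same idea looks like this; skipping the "." and ".." entries and adding $! to the error messages are my own tweaks, not anything required:

    my $dir = '/some/where/';
    chdir $dir or die "Can't chdir $dir: $!\n";

    opendir my $dh, '.' or die "Can't opendir '.': $!\n";
    my @files = grep { $_ ne '.' && $_ ne '..' } readdir $dh;
    closedir $dh;

    my %filesize;
    $filesize{$_} = -s $_ for @files;

    # e.g. report the files from largest to smallest
    printf "%10d  %s\n", $filesize{$_}, $_
        for sort { $filesize{$b} <=> $filesize{$a} } keys %filesize;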

None of these are guaranteed to make it faster; that depends more on your underlying operating system than on the Perl code. But they may be worth trying.
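
If you want to see whether it actually helps on your box, the core Benchmark module makes a quick comparison easy enough. A rough sketch, again using the hypothetical /some/where directory:

    use Benchmark qw(timethese);

    my $dir = '/some/where/';
    opendir my $dh, $dir or die "Can't opendir $dir: $!\n";
    my @files = grep { $_ ne '.' && $_ ne '..' } readdir $dh;
    closedir $dh;

    timethese(10, {
        full_path => sub { my %s; $s{$_} = -s "$dir$_" for @files },
        relative  => sub {
            chdir $dir or die "Can't chdir $dir: $!\n";
            my %s;
            $s{$_} = -s $_ for @files;
        },
    });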

HTH