in reply to -s takes too long on 15,000 files

VSarkiss's suggestion to separate the directory read from the file stat allows you to be more perlish, though it won't solve your ultimate problem (molasses-like run times on your PC). The following idiom (adapted from an example in the Camel) puts the names of all files called '*.log' or '*.LOG'¹ in DIR into the @logfiles array:
my @logfiles = grep { /\.log$/i } readdir DIR;
You should use the /i modifier in cases like this rather than 'lc': lowercasing alters the filenames you read back, and that's not really what you want (as tilly pointed out).

What's the spec of the PC that you're using? On our oldest, slowest PC (P233/64MB, Win98) this operation takes 7 minutes, compared to just under 5 minutes on a newer but not super slick PC (Celeron 500/32MB, WinMe), accessing the files over our LAN (3 mins when the files were moved to a local directory). By contrast, this runs in less than a second on a Sun Ultra 5...

¹(Update) It will also pick up files called *.Log, *.LOg, *.lOg, etc. I assume that's probably not a big deal.