in reply to File::Copy and file manipulation performance

Many filesystems don't deal well with large numbers of files in the same directory; exactly what counts as "large" varies by OS, version, filesystem type, etc., but you're obviously bumping up against it.

One common scheme is to set up multiple subdirectories (named after hex digits, for example, so you've got "0".."9" and "a".."f") and use some sort of hashing function to assign each new file to a subdirectory. That reduces the number of files in any one directory to 1/16th of what it was. Add more subdirectories (two hex digits, "00" through "ff", for 1/256th) or more levels ("0/0", "0/1", . . .) to spread things out further if required. A sketch of the one-level scheme follows.
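Here's a minimal sketch of that scheme in Perl, assuming a hypothetical destination root and using the first hex digit of the filename's MD5 digest as the bucket; any stable hash over the name works just as well:

    use strict;
    use warnings;
    use Digest::MD5    qw(md5_hex);
    use File::Copy     qw(move);
    use File::Path     qw(make_path);
    use File::Basename qw(basename);

    # Pick a bucket ("0".."f") from the first hex digit of the name's
    # MD5 digest; take two digits instead for 256 buckets.
    sub bucket_for {
        my ($name) = @_;
        return substr(md5_hex($name), 0, 1);
    }

    my $base = '/var/spool/myapp';    # hypothetical destination root

    for my $file (@ARGV) {
        my $name = basename($file);
        my $dir  = "$base/" . bucket_for($name);
        make_path($dir) unless -d $dir;
        move($file, "$dir/$name") or warn "move $file: $!\n";
    }

Because the bucket depends only on the name, you can recompute it later to locate a file without scanning every subdirectory.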