I don't see how the size of the files in a directory would affect File::Find, building the data structure, or shuffling the list. The number of files per directory, on the other hand, would have an effect. How many files are in each directory?
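A minimal sketch of that approach, assuming List::Util's shuffle (core since Perl 5.8): collect the paths once with File::Find, then shuffle them, so the traversal cost scales with the number of entries rather than file sizes, and no file is drawn twice.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Find;
    use List::Util qw(shuffle);

    my $dir = shift // '.';    # starting directory from the command line

    # Collect every plain file under $dir. Inside the wanted sub,
    # $_ is the basename and $File::Find::name is the full path.
    my @files;
    find( sub { push @files, $File::Find::name if -f }, $dir );

    # One shuffle gives a random order with no repeats.
    my @random_order = shuffle @files;

    my $n = @random_order < 10 ? @random_order : 10;
    print "$_\n" for @random_order[ 0 .. $n - 1 ];    # first 10 random files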
I would think File::Random would also be slow if the number of files in a directory is huge. A quick check of the code in File::Random makes me think that it re-reads the whole directory every time you call random_file(), so you could get the same file returned more than once.
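To illustrate that with-replacement behavior, here is a small sketch; it assumes File::Random's documented random_file() interface with -dir and -recursive options, and simply reports any file drawn twice:

    use strict;
    use warnings;
    use File::Random qw(random_file);

    # Each call rescans the directory tree and draws independently,
    # so the same file can come back on two different calls.
    my %seen;
    for ( 1 .. 10 ) {
        my $f = random_file( -dir => '.', -recursive => 1 );
        next unless defined $f;    # undef if no file matched
        print "duplicate: $f\n" if $seen{$f}++;
    }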