We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. (Donald Knuth)
Though this statement is not applicable in all situations, it certainly is in this one.
Why do you think that
"optimiz[ing] it by having _as many_ files as possible in the 'wanted' function" would affect the performance? The only thing here you could call (minimal) "overhead" is the repeated calls to wanted(), but that's just the price you pay for using File::Find.
Why do you think that the OS would
"move(...) the files all together in _one shot_ or at least in big chunks"? It moves them file by file.
Update: If you perceive performance differences between your Perl version and a shell-based equivalent, that's because File::Copy's move() falls back to copying when a plain rename isn't possible (typically when source and target are on different filesystems). In that case it moves a file by
- opening the target for writing
- reading from the source and writing to the target
- unlinking/deleting the source
Depending on the OS and filesystem this is less than optimal, but the behaviour itself is not OS dependent.
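In other words, move() behaves roughly like this simplified sketch (not the module's actual source, just the idea):

    use strict;
    use warnings;
    use File::Copy ();

    sub move_like_file_copy {
        my ($from, $to) = @_;
        return 1 if rename $from, $to;             # same filesystem: one cheap syscall
        File::Copy::copy($from, $to) or return 0;  # otherwise: read source, write target
        return unlink $from;                       # ...then delete the source
    }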
If that's no concern you could push the files found by wanted() into an array (or better yet use
File::Find::Rule, which doesn't have this clunky callback interface) and feed that to the system's "mv", using proper quoting.
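A rough sketch of that approach, with made-up directories and match criteria; the list form of system() bypasses the shell entirely, so odd filenames need no quoting at all:

    use strict;
    use warnings;
    use File::Find::Rule;

    # collect everything up front instead of acting inside a callback
    my @files = File::Find::Rule->file
                                ->name('*.log')        # whatever criteria wanted() used
                                ->in('/some/source');

    if (@files) {
        # '--' guards against filenames that start with a dash
        system('mv', '--', @files, '/some/target') == 0
            or die "mv failed: $?";
    }

(For a very large number of files you would still want to hand them to mv in batches to stay under the kernel's argument-length limit.)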
holli
You can lead your users to water, but alas, you cannot drown them.