I try to avoid such overkill when I'm just recursing down 2 directories and throwing the structure in a HoA.
You keep saying overkill. You said it about File::Find itself, and you say it here. I think it's a bad argument. File::Find uses Carp, Exporter and Cwd either directly or indirectly. Exporter gets included with most modules, Cwd is pretty common and isn't that big anyway, and Carp is like Exporter: it's almost always loaded already (just saying use warnings; brings it into existence). As for File::Find itself, it isn't that large either. So I think your overkill argument is bogus, especially when it allows you to write:
use File::Find; my @files; find { no_chdir => 1, wanted => sub { -d and return; push @files, $_ } }, @roots;
Putting the files into the desired data structure is left as an exercise for you. I might just say that I suspect File::Spec will come in useful.
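For what it's worth, here is one way that exercise might go -- a minimal sketch only, assuming you want the HoA keyed by directory; @roots and %files_by_dir are made-up names, and File::Spec->splitpath does the path splitting so it stays portable:

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use File::Spec;

# Hypothetical roots -- substitute the two directories you actually care about.
my @roots = ('dir1', 'dir2');

# Hash of arrays: directory path => [ names of files found directly in it ].
my %files_by_dir;

find(
    {
        no_chdir => 1,              # $_ holds the full path, no cwd juggling
        wanted   => sub {
            return if -d;           # skip the directories themselves
            my ($volume, $dir, $file) = File::Spec->splitpath($_);
            my $key = File::Spec->catpath($volume, $dir, '');
            push @{ $files_by_dir{$key} }, $file;
        },
    },
    @roots,
);

# Quick look at what was collected.
for my $dir (sort keys %files_by_dir) {
    print "$dir\n";
    print "  $_\n" for @{ $files_by_dir{$dir} };
}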
The point here is that in trying to avoid overkill that doesn't exist you have wasted a bunch of time partially reinventing a wheel. You seem to be more concerned with the startup time of your script (which is mostly determined by the OS and physical resource bottlenecks) than with the time it takes you to write the script. I bet you would have to run your program literally thousands of times before you won back the time you have personally lost.
This comes down to premature optimisation. You are trying to make a process faster when you have absolutely no evidence that the process is suffering performance problems. (You can't have this evidence, as you haven't written the script yet, have you?) So you have committed two golden gaffes: the first is to optimise prematurely, and the second is to badly reinvent a wheel in order to do so. This is hardly an efficient use of your time.
In reply to Re: Re: Re: Directory Recursion Disorder
by demerphq
in thread Directory Recursion Disorder
by Anonymous Monk