PerlMonks |
Re: Automatically distributing and finding files in subdirectories
by rodion (Chaplain)
on Jul 18, 2006 at 02:05 UTC ( #561907=note )
This is a case where it's a good idea to use the shell, which is well optimized for handling directories quickly and has little overhead if you only invoke it once. The shell command "ls -1" can give you a list of files to work with ("ls -1t" if you want them in chronological order, as you probably know). You can do this outside Perl, or inside Perl with a piped file open or a system() call.
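As a minimal sketch of the piped-open approach (the scratch directory and file names here are invented for illustration; per the update below, "find" is used rather than "ls -1"):

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Hypothetical demo: build a scratch directory with three files,
# then read the names back through a piped open of "find".
my $path = tempdir(CLEANUP => 1);
for my $i (1 .. 3) {
    open my $out, '>', "$path/file$i.txt" or die "create: $!";
    close $out;
}

# List-form piped open avoids shell quoting issues with $path.
open my $fh, '-|', 'find', $path, '-type', 'f'
    or die "can't run find: $!";
chomp(my @files = <$fh>);
close $fh;

print scalar(@files), " files found\n";
```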
Then you process the text output of ls with Perl, which is good at handling text and also has the tools to make directories and move files. Here's an example which moves the files to sub-directories, putting a fixed number of files (100) in each sub-directory. This is pretty transparent in my book (as you requested), assuming you're familiar with the Perl string-increment magic used in "$subdir++". If not, it's easy to split that into a string and a $dir_cnt++, and concatenate them.

Update: Changed 'system "ls -1 $path"' to 'system "find $path"', thanks to shmem's reminder of the lstat overhead in "ls". Now that I have access to Linux and BSDI boxes again, I did some timings on each: 'find' is about the same speed as opendir(), while 'ls -1' takes 60-100% longer, depending on the system. I prefer the 'system "find $path"' version for clarity, but I was totally wrong about the speed advantages. Thanks to graff and shmem for catching what I forgot.
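The example code itself didn't survive the page scrape, so here is a hedged reconstruction of the approach described above, not the author's original code. The sub name `distribute` and the `$per_dir` parameter are my own; the batch size of 100 matches the text, and `$subdir` relies on Perl's magical string increment ('aa' -> 'ab' -> ...):

```perl
use strict;
use warnings;
use File::Copy qw(move);
use File::Path qw(make_path);

# Hedged sketch: list the plain files under $path with "find",
# then move them into sub-directories of $per_dir files each.
sub distribute {
    my ($path, $per_dir) = @_;
    open my $fh, '-|', 'find', $path, '-maxdepth', '1', '-type', 'f'
        or die "can't run find: $!";
    chomp(my @files = <$fh>);      # slurp the list before moving anything
    close $fh;

    my $subdir = 'aa';
    my $count  = 0;
    for my $file (@files) {
        if ($count % $per_dir == 0) {
            $subdir++ if $count;   # each new batch gets the next name
            make_path("$path/$subdir");
        }
        move($file, "$path/$subdir/") or die "move $file: $!";
        $count++;
    }
    return $count;
}
```

With 250 files and `distribute($path, 100)`, this would fill aa and ab with 100 files each and ac with the remaining 50.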
In Section
Seekers of Perl Wisdom