Then you process the text output of ls with Perl, which is good at handling text and also has the tools to make directories and move files.
Here's an example which moves the files to sub-directories, putting a fixed number of files (100) in each sub-directory.
This is pretty transparent in my book (as you requested), assuming you're familiar with the Perl increment magic used in "$subdir++". If not, it's easy to split that into a string and a $dir_cnt++, and concatenate them (see the sketch after the code).

    use strict; use warnings;

    my $path     = '/var/local/path/to/files';
    my $subdir   = 'lower0';
    my $file_cnt = 0;

    # Backticks capture find's output; system() only returns an exit status.
    my @files = `find $path -maxdepth 1 -type f`;
    die "find failed with $?" if $?;

    mkdir "$path/$subdir" or die "mkdir $subdir: $!";
    while ( my $name = shift @files ) {
        chomp $name;
        $name =~ s{^.*/}{};    # find prints full paths; keep just the basename
        if ( $file_cnt && $file_cnt % 100 == 0 ) {
            $subdir++;         # magic string increment: 'lower0' -> 'lower1'
            mkdir "$path/$subdir" or die "mkdir $subdir: $!";
        }
        $file_cnt++;
        rename "$path/$name", "$path/$subdir/$name" or warn "rename $name: $!";
    }
    # untested (I'm not near a unix box)
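If the increment magic is unfamiliar, here's a minimal sketch of what it does, next to the split-and-concatenate alternative mentioned above ($prefix and $dir_cnt are illustrative names, not from the code):

    my $subdir = 'lower0';
    $subdir++;                   # 'lower1' -- the magic string increment
    # Note the carry: 'lower9'++ gives 'lowes0', not 'lower10'. Names stay
    # unique either way, but if you want 'lower10', use the split form:

    my $prefix  = 'lower';
    my $dir_cnt = 0;
    my $name    = $prefix . $dir_cnt++;   # 'lower0', then 'lower1', ... 'lower10'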
Update: Changed the 'ls -l $path' listing to 'find $path', thanks to shmem's reminder of the lstat overhead in "ls".
Now that I have access to Linux and BSDI boxes again, I did some timings on each. 'find' is about the same speed as opendir(). 'ls -1' takes 60-100% longer, depending on the system.
I prefer the 'find $path' version for clarity, but I was totally wrong about the speed advantages. Thanks to graff and shmem for catching what I forgot.
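For comparison, here's a minimal sketch of the opendir() variant mentioned in the timings above (not the exact code I timed, and it assumes $path from the earlier snippet):

    # Read the directory in-process: no child process, and no stat per entry
    # as long as you don't test each name with -f.
    opendir my $dh, $path or die "opendir $path: $!";
    my @files = grep { !/^\.\.?$/ } readdir $dh;   # skip . and ..
    closedir $dh;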