I did perl -le 'print for <*>' | wc -l, and it listed over 100,000 files (after a while), so I don't think the size of your directory is the problem. It could still be a problem if your memory is very limited, but in that case I'd expect you to get an explicit message about it. (The file glob you're doing builds a list of all those 60,000+ names in memory at once.)
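If memory ever did become the limit, readdir walks the directory one entry at a time instead of building the whole list. A minimal sketch, just counting entries in the current directory:

opendir my $dh, '.' or die "Can't opendir '.': $!\n";
my $count = 0;
while ( defined( my $entry = readdir $dh ) ) {
    next if $entry eq '.' || $entry eq '..';    # skip the dot entries
    $count++;
}
closedir $dh;
print "$count entries\n";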
The problem may be that after mkdir, you or into a bare string. Evaluating that string does nothing, so a failed mkdir goes by silently. You may want:
mkdir "$dst2\\$datestamp2" or die "Can't mkdir '$dst2\\$datestamp2': $!\n";
After that, I expect the move to fail too, but the warn there should tell you why. If it doesn't, I wonder whether STDERR is being redirected, or there's a $SIG{__WARN__} handler eating it, or something.
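Assuming the move is File::Copy's move (a guess on my part, and $file here is a placeholder for your loop variable), the same check-and-report pattern applies, since move returns false on failure and sets $!:

use File::Copy;

move( $file, "$dst2\\$datestamp2\\$file" )
    or warn "Can't move '$file' to '$dst2\\$datestamp2': $!\n";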
I'd make your end-on-limit more explosive:
if ( $count > $limit ) { die "limit exceeded: $count files processed"; }
Generally I recommend setting $limit to about 10 and sprinkling plenty of print statements in the loop to make sure all those variables have the values you think they do.
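Something like this, with my guesses (@files, $file) standing in for however your loop is actually shaped:

my $limit = 10;
my $count = 0;
for my $file (@files) {    # @files is a stand-in for your file list
    $count++;
    # show exactly what each variable holds on every pass
    print "count=$count file='$file' dest='$dst2\\$datestamp2'\n";
    die "limit exceeded: $count files processed" if $count > $limit;
    # ... mkdir and move work goes here ...
}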