Hmmm, File::Find is really for delving into the subdirs,
I know, and it's sometimes difficult to grok how to use it correctly - the whole "prune/not prune" thing is still unclear to me (a minimal sketch of how I currently understand it is below). But besides that...
and if you don't want that, then just glob(*) like this:
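On the prune business, here's the minimal sketch of how I currently understand it - setting $File::Find::prune inside the wanted sub tells find() not to descend into the current directory. The spool path and the 'archive' name are just made up for the example:

    use strict;
    use warnings;
    use File::Find;

    my @files;
    find(
        sub {
            # $_ is the basename here; skip any subdir called 'archive'
            if (-d $_ && $_ eq 'archive') {
                $File::Find::prune = 1;   # don't descend into it
                return;
            }
            push @files, $File::Find::name if -f $_;
        },
        '/var/spool/myapp',               # made-up starting directory
    );
    print scalar(@files), " files found\n";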
For a while now I've been pondering which was faster, a glob or a readdir, so I decided to test it on a big directory I've got lying around (36K files).
The answer?
glob failed (child exited with status 1) at trial.pl line 12.
Nuts. Do I recall correctly that Perl hands globbing off to the shell (csh, I think) in some way? If so, going with a readdir on a big directory may be your only choice.
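If glob is out, something along these lines with opendir/readdir is what I'd reach for - no shell involved, so no "child exited" surprises. The path and the seven-day cutoff are invented for illustration:

    use strict;
    use warnings;

    my $dir = '/var/spool/myapp';         # made-up directory

    opendir my $dh, $dir or die "can't opendir $dir: $!";
    while (defined(my $name = readdir $dh)) {
        next if $name eq '.' or $name eq '..';
        my $path = "$dir/$name";
        next unless -f $path;
        if (-M $path > 7) {               # -M: age in days since modification
            unlink $path or warn "couldn't unlink $path: $!";
        }
    }
    closedir $dh;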
As for speed issues, you have to loop over every file no matter what. And although I can't speak for NTFS/HPFS/FAT filesystems, I know that a flat directory on Solaris or Linux is going to be a serious dog to scan once it gets past 1000 files or so.
Implement a hierarchical subdirectory scheme - maybe based on the date, which would simplify purging, too.
That's what I did, faced with a similar problem, anyway. :-)
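To make that concrete, here's a rough sketch of the sort of date-based layout I mean - new files go under $base/YYYY/MM/DD, and purging a whole day becomes a single rmtree() instead of a scan over thousands of files. The base directory and the 30-day cutoff are placeholders:

    use strict;
    use warnings;
    use File::Path qw(mkpath rmtree);
    use POSIX qw(strftime);

    my $base = '/var/spool/myapp';        # made-up base directory

    # Directory for a given epoch time, e.g. $base/2002/05/17
    sub dated_dir {
        my $when = shift || time;
        my $dir  = "$base/" . strftime('%Y/%m/%d', localtime($when));
        mkpath($dir) unless -d $dir;
        return $dir;
    }

    # Purging one day is a single rmtree, no per-file scan needed
    sub purge_day {
        my $when = shift;
        rmtree("$base/" . strftime('%Y/%m/%d', localtime($when)));
    }

    my $today = dated_dir();
    print "new files go into $today\n";
    purge_day(time - 30 * 24 * 60 * 60);  # drop everything from 30 days back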
Peace,
-McD
In reply to Re: Re: Deleting Files by McD in thread Deleting Files by BatGnat