hi, if I had a cron script delete files every 4 hrs, 6x a day, would I have to be concerned about the script timing out if it had to delete 100,000-200,000 files of 10 KB each? The files will never exceed 10 KB.
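Something like this is what I have in mind, as a rough sketch (the directory path /var/tmp/cache is just a placeholder):

#!/usr/bin/perl
# Rough sketch of the cleanup job; /var/tmp/cache is a placeholder path.
use strict;
use warnings;

my $dir = '/var/tmp/cache';
opendir my $dh, $dir or die "Can't open $dir: $!";
my $count = 0;
while (defined(my $name = readdir $dh)) {
    next if $name eq '.' || $name eq '..';
    unlink "$dir/$name" and $count++;
}
closedir $dh;
print "deleted $count files\n";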
The file sizes should be irrelevant on most filesystems I know of. But timed out by what? Do you mean overlapping with the next run? I doubt it. Testing on XP with NTFS, which I doubt is by far the most efficient situation:
C:\temp>mkdir test
C:\temp>cd test
C:\temp\test>perl -e "for (1..100_000) { open my $f, '>', $_ or die $! }"
C:\temp\test>perl -le "$n=time; unlink 1..100_000; print time-$n"
55
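That's 55 seconds to unlink 100,000 files, so even at double that count you'd be well inside a 4-hour window. If overlapping runs are what worry you, the usual guard is a lock file with a non-blocking flock, so a second instance simply exits; a sketch (the lock path is whatever suits your setup):

#!/usr/bin/perl
# Sketch of a lock-file guard so overlapping cron runs exit early.
# /var/tmp/cleanup.lock is an arbitrary choice; any path writable by
# the cron user will do.
use strict;
use warnings;
use Fcntl qw(:flock);

open my $lock, '>', '/var/tmp/cleanup.lock' or die "Can't open lock: $!";
flock($lock, LOCK_EX | LOCK_NB)
    or exit 0;    # a previous run still holds the lock; let it finish

# ... the actual unlink loop goes here ...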
In reply to Re: unlinking performance by blazar, in thread unlinking performance by Anonymous Monk