in reply to Managing a directory with millions of files

Filling an array with millions of filenames just to get a total count doesn't seem very resource-friendly to me. Maybe something more like:
    use strict;
    use warnings;

    my $dir = shift;
    die "Usage: $0 directory" unless defined $dir;

    opendir DIR, $dir or die "Could not open $dir: $!\n";
    my $count = 0;
    $count++ while defined readdir DIR;
    closedir DIR;

    print "$count\n";
A good tool shouldn't put a strain on low-memory environments.
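
For contrast, the slurping pattern being avoided probably looked something like the sketch below (the OP's original code isn't reproduced in this thread, so this is only a guess at the shape of it). readdir in list context pulls every entry into memory before anything gets counted:

    # Hypothetical sketch of the array-slurping approach, not the OP's code:
    # readdir in list context builds the full list of entries in RAM,
    # and only then is the count taken.
    opendir DIR, $dir or die "Could not open $dir: $!\n";
    my @files = readdir DIR;
    closedir DIR;
    print scalar(@files), "\n";

With millions of entries that list alone can eat a lot of memory, while the streaming count above stays flat no matter how large the directory is.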

Re^2: Managing a directory with millions of files
by markhu (Initiate) on Mar 01, 2008 at 19:10 UTC
    Touché, but in defense of the OP, his use of a built-in Perl slurpish function in the first algorithm is the foundation upon which the second is built. And he implied he'd hand-typed these in a few times, so the smaller/simpler the better, even if only by one line/several keywords. And finally, he didn't say the resource being taxed was the RAM, but the filesystem; the disk subsystem is generally the weak point in modern servers with multi-gigabyte RAM and fast CPUs.