Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Obviously there are many ways to do this, but what is the quickest, most efficient way? (And if I use opendir, how would I keep it from counting "." and ".." too?)

Re: Quickest, most efficient way to get number of files in dir?
by merlyn (Sage) on Mar 18, 2003 at 03:21 UTC
    opendir D, "somedir" or die;
    my $count = () = readdir D;
    $count -= 2;  # :-)
    closedir D;
    Although you might benchmark that against this for both small and huge directories:
    opendir D, "somedir" or die;
    my $count = -2;
    $count++ while readdir D;
    closedir D;

    -- Randal L. Schwartz, Perl hacker
    Be sure to read my standard disclaimer if this is a reply.
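    On the "." and ".." part of the original question: instead of subtracting 2 (which assumes those are the only extra entries), you can filter them out explicitly with grep. A minimal sketch, using a throwaway temp directory purely for demonstration:

    ```shell
    # Build a throwaway test directory with two visible files and one dotfile
    demo=$(mktemp -d)
    touch "$demo/a" "$demo/b" "$demo/.hidden"

    # Count entries, skipping "." and ".." by name rather than
    # assuming they are the only two extra entries readdir returns
    count=$(perl -e '
        opendir my $dh, $ARGV[0] or die "opendir: $!";
        print scalar grep { $_ ne "." && $_ ne ".." } readdir $dh;
    ' "$demo")
    echo "$count"   # 3: a, b, and .hidden
    ```

    grep in scalar context returns the number of matching elements, so no explicit loop or subtraction is needed.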

Re: Quickest, most efficient way to get number of files in dir?
by robartes (Priest) on Mar 18, 2003 at 07:36 UTC
    Just a quick benchmark. I've used both of merlyn's variations on readdir and thrown in glob as well (just for giggles, as is shown in the benchmark results). Here's the code:
    #!/usr/local/bin/perl -w
    use strict;
    use Benchmark qw(:all);

    my $dir = "/home/bv/tmp/testdir";

    cmpthese('1000', {
        'merlyn_1' => \&merlyn1,
        'merlyn_2' => \&merlyn2,
        'glob'     => \&glob,
    });

    sub merlyn1 {
        opendir D, $dir or die;
        my $count = () = readdir D;
        $count -= 2;  # :-)
        closedir D;
    }

    sub merlyn2 {
        opendir D, $dir or die;
        my $count = -2;
        $count++ while readdir D;
        closedir D;
    }

    sub glob {
        my $count = () = glob("$dir/*");
        # hidden files; the pattern already excludes "." and "..",
        # so no subtraction is needed here
        $count += () = glob("$dir/.[^.]*");
    }
    With 255 (empty) files, this is the result:
    Benchmark: timing 1000 iterations of glob, merlyn_1, merlyn_2...
          glob:  6 wallclock secs ( 4.55 usr + 1.55 sys = 6.10 CPU) @  163.93/s (n=1000)
      merlyn_1:  1 wallclock secs ( 0.37 usr + 0.13 sys = 0.50 CPU) @ 2000.00/s (n=1000)
      merlyn_2:  1 wallclock secs ( 0.47 usr + 0.17 sys = 0.64 CPU) @ 1562.50/s (n=1000)

                  Rate     glob merlyn_2 merlyn_1
    glob         164/s       --     -90%     -92%
    merlyn_2    1562/s     853%       --     -22%
    merlyn_1    2000/s    1120%      28%       --
    That's clear: glob is very much out of the picture, while merlyn_1 is about 28% faster than merlyn_2.

    Let's try 1024 files:

                  Rate     glob merlyn_2 merlyn_1
    glob        37.0/s       --     -91%     -93%
    merlyn_2     400/s     982%       --     -22%
    merlyn_1     513/s    1287%      28%       --
    Same result: merlyn_1 is the clear winner! I'm guessing this is because it avoids incrementing $count once per file, but that's just a wild guess.
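    The counting trick in merlyn_1 is the list-assignment-in-scalar-context idiom: a list assignment, evaluated in scalar context, yields the number of elements on its right-hand side, so the whole readdir list is counted in one op inside perl rather than in a Perl-level loop. A quick demonstration of the idiom on its own (the sample list is arbitrary):

    ```shell
    # my $n = () = LIST; -- the list assignment happens in scalar
    # context, which returns the element count of the right-hand side
    perl -e 'my $n = () = ("a", "b", "c", "d"); print "$n\n"'   # prints 4
    ```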

    Update: For some extra giggles, here's the result for 10K files:

                  Rate     glob merlyn_2 merlyn_1
    glob        3.04/s       --     -92%     -93%
    merlyn_2    38.0/s    1150%       --      -7%
    merlyn_1    40.7/s    1242%       7%       --

    CU
    Robartes-