In the end I was able to do more or less what I had in mind at the beginning. The implementation is recursive (I wanted to mix both things in) but behaves like an iterator thanks to the closure: it keeps two queues, one for filenames already read and one for directories still to be read. When a directory is read, the function recurses until a file is found; the interface, however, remains a plain iterator.
{
    my @fq;          # queue of plain files waiting to be returned
    my @dq;          # queue of directories waiting to be read
    my $currentDir;

    sub initDir { push @dq, shift; }

    sub getNextFile {
        my $f;
        if ( @fq ) {
            $f = pop @fq;
        } elsif ( @dq ) {
            $currentDir = pop @dq;
            opendir( my $dh, $currentDir ) or die "$currentDir: $!";
            while ( defined( my $r = readdir($dh) ) ) {
                # skip only the two special entries; /^\.{1,3}$/ would
                # also (wrongly) skip a legal file named '...'
                next if $r eq '.' or $r eq '..';
                my $ff = "$currentDir/$r";
                if ( -d $ff ) {
                    push @dq, $ff;
                } else {
                    push @fq, $ff;
                }
            }
            closedir $dh;
            $f = getNextFile();   # recurse until a file turns up (or the queues empty)
        } else {
            $f = undef;           # both queues exhausted: iteration is over
        }
        return $f;
    }
}
my $dir = "c:/inetpub";
# execution starts here
initDir( $dir );
my $f;
my $i = 0;
while ( defined( $f = getNextFile() ) ) {
    $i++;
    print "[$i] $f\n";
}
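One thing I noticed: a long run of empty directories makes getNextFile call itself once per directory. If that recursion ever became a concern, the same logic could be restated with a loop in place of the self-call. A sketch (untested, same queues and same closure as above):

```perl
sub getNextFile {
    # Drain directories until a file is queued or both queues run dry.
    until ( @fq ) {
        return undef unless @dq;          # nothing left: iteration is over
        my $dir = pop @dq;
        opendir( my $dh, $dir ) or die "$dir: $!";
        while ( defined( my $r = readdir($dh) ) ) {
            next if $r eq '.' or $r eq '..';
            my $ff = "$dir/$r";
            if ( -d $ff ) { push @dq, $ff } else { push @fq, $ff }
        }
        closedir $dh;
    }
    return pop @fq;
}
```

Behavior should be identical; it just trades the recursive call for an `until` loop, so an arbitrarily deep chain of empty directories costs no stack.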
What do you think about it? Is there anything fundamentally flawed with this approach?