in reply to Fast file and directory operations

"mr_mischief" beat me to the punch whilst typing, but File::Find works to return full pathnames of files in a directory without the need for chdir, readdir, or opendir. Then you can stat, move etc., using those full paths. Of course, you already have the pathname! so i don't understand why you need to chdir anyway (eg. won't "for my $file (<$src/*.imap>) work?), but here's a File::Find demo to consider:
#!/usr/bin/perl
use strict;
use File::Find;

my @dirs  = ("/home");
my @files = ();

sub dirscan {
    if ($_ =~ /\.imap$/) { push @files, $File::Find::name }
}

find(\&dirscan, @dirs);
foreach (@files) { print "$_\n" }
Positive thinking leads me to believe it will be faster.
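To back up the glob suggestion above: the paths a glob returns already carry the directory prefix, so you can stat or rename them with no chdir at all. A minimal sketch, assuming $src holds the source directory (the directory name here is just an example; the .imap extension comes from the original question):

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $src = shift || '/var/mail/imapdirs';    # example directory
for my $file (<$src/*.imap>) {
    # $file is already "$src/whatever.imap", usable as-is
    my @st = stat $file or next;
    printf "%s\t%d bytes\n", $file, $st[7];
}
```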

Replies are listed 'Best First'.
Re^2: Fast file and directory operations
by zentara (Cardinal) on Mar 12, 2008 at 13:20 UTC
    HOWEVER File::Find is mandatorily recursive -- or at least, if anyone knows how to change that, I'd be happy to know.

    Try counting slashes and prune:

    #!/usr/bin/perl
    # linux only
    use warnings;
    use strict;
    use File::Find;
    use File::Spec;

    if (@ARGV < 2) { print "Usage: $0 dir depth\n"; exit }
    my ($path, $depth) = @ARGV;
    my $abs_path = File::Spec->rel2abs($path);  # in case you enter . for dir
    my $m = ($abs_path) =~ tr!/!!;              # count slashes in top path
    find(\&found, $abs_path);
    exit;

    sub found {
        my $n = ($File::Find::name) =~ tr!/!!;  # count slashes in file
        return $File::Find::prune = 1 if $n > ($m + $depth);
        # do stuff here.
        #print "$_\n";                # name only
        print "$File::Find::name\n";  # name with full path
    }
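And if you truly want a single level with no recursion at all, plain opendir/readdir sidesteps File::Find entirely -- a sketch under the assumption you only care about regular files in the top directory (the directory name is just an example):

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $dir = shift || '/home';    # example directory
opendir my $dh, $dir or die "Can't opendir $dir: $!";
# one level only: skip . and .., prepend the dir, keep regular files
my @files = grep { -f } map { "$dir/$_" } grep { !/^\.\.?$/ } readdir $dh;
closedir $dh;
print "$_\n" for @files;
```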

    or use File::Find::Rule

    #!/usr/bin/perl
    use warnings;
    use strict;
    use File::Find::Rule;

    my $word = shift || 'perl';
    # find all the files of a given directory
    my $directory = shift || '/home/zentara';
    my $depth = shift || 3;

    my $rule1 = File::Find::Rule->new;
    $rule1->maxdepth($depth);
    $rule1->file;
    $rule1->grep(qr/\Q$word\E/);
    #$rule1->name( '*.pm' );
    my @files = $rule1->in($directory);
    print "@files\n";
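Tying this back to the original poster's .imap files: the rule methods also chain, so a non-recursive match of one extension collapses into a single expression. A sketch assuming File::Find::Rule is installed from CPAN (the directory name is just an example; maxdepth(1) keeps only files directly inside the top directory):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find::Rule;

my $src = shift || '/var/mail/imapdirs';    # example directory
# chained form: regular files, top level only, matching *.imap
my @imap = File::Find::Rule->file
                           ->maxdepth(1)
                           ->name('*.imap')
                           ->in($src);
print "$_\n" for @imap;
```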

    I'm not really a human, but I play one on earth. Cogito ergo sum a bum