in reply to Using directory handles in recursive functions
I believe if you try this:

    sub foo {
        my $dir = shift;
        chdir $dir;
        local *CURRENT_DIR;
        # Everything else follows
    }

you will have better success. Basically, the local() protects the calling function's version of CURRENT_DIR from any changes made locally.
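To make the pattern concrete, here is a fuller sketch of a recursive walk using a local()-ed handle. The sub and variable names are mine, not from the original; the point is that each call localizes the CURRENT_DIR glob, so a recursive call's opendir/closedir cannot clobber its caller's still-open handle.

```perl
use strict;
use warnings;

# Recursive listing with a localized bareword handle (hypothetical
# names). local *CURRENT_DIR saves the caller's glob, so the same
# handle name can be reused at every depth of the recursion.
sub list_tree {
    my $dir = shift;
    my @files;
    local *CURRENT_DIR;    # protect the caller's CURRENT_DIR
    opendir CURRENT_DIR, $dir or die "Couldn't open $dir: $!";
    while ( defined( my $entry = readdir CURRENT_DIR ) ) {
        next if $entry =~ /^\.\.?$/;    # skip . and ..
        my $path = "$dir/$entry";
        if ( -d $path ) {
            push @files, list_tree($path);    # handle name reused safely
        }
        else {
            push @files, $path;
        }
    }
    closedir CURRENT_DIR;
    return @files;
}
```

Note that this version keeps one handle open per level of recursion, which is exactly the file-handle cost discussed below.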
In a broader issue, let me give you a few hints. If your directory structure is very deep, you will run out of file handles soon. Those of us with (way too much) experience writing this kind of code usually suck the contents into an array and then close the directory handle.
This will also obviate your problem.

    opendir FOO, "$dir" or die "Couldn't open $dir: $!";
    @contents = readdir FOO;
    closedir FOO;

A common variant on this is to read the contents like

    @contents = grep !/^\.\.?$/, readdir FOO;

but I do not much care for this. If you are parsing the contents of the directory later, you have just caused perl to walk the array twice when a simple next() could have done the job.
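For contrast, here is a one-pass sketch (sub and variable names are mine) that skips . and .. with next() while processing each entry, rather than grep'ing into a filtered array and then walking that array a second time:

```perl
use strict;
use warnings;

# One pass over the directory: filter and process in the same loop.
# The next() here does the job the grep above did, without the
# second walk over @contents.
sub process_dir {
    my $dir = shift;
    my @seen;
    opendir FOO, $dir or die "Couldn't open $dir: $!";
    while ( defined( my $entry = readdir FOO ) ) {
        next if $entry =~ /^\.\.?$/;    # skip . and .. in-line
        push @seen, $entry;             # "do something interesting" here
    }
    closedir FOO;
    return @seen;
}
```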
If the directory is too big to suck in at once and you are still running out of file handles, you can do it this way:

    sub recurse_me {
        my $dir = shift;
        my @dirs;
        opendir FOO, "$dir" or die "Couldn't open $dir: $!";
        while ( defined( my $entry = readdir FOO ) ) {
            next if $entry =~ /^\.\.?$/;      # don't recurse into . or ..
            if ( -d "$dir/$entry" ) {         # test the full path, not the bare name
                push @dirs, "$dir/$entry";
            } else {
                # do something interesting
            }
        }
        closedir FOO;                         # handle is closed before we recurse
        recurse_me( $_ ) for ( sort @dirs );
    }

Probably more than you asked, but I thought it might help you avoid some later pain.
Hth,
mikfire