in reply to Using directory handles in recursive functions

If you are playing chdir games, I would first suggest trying File::Find. Understanding that such an answer is not always the best, and that there is frequently knowledge to be gleaned from reinventing wheels, I would direct your attention to the local() keyword.
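For completeness, here is a minimal File::Find sketch (the starting directory $start_dir is a placeholder; find() does all the chdir and handle bookkeeping for you):

use File::Find;

find( sub {
    return if /^\.\.?$/;            # skip . and ..
    print "$File::Find::name\n";    # full path of the current entry
}, $start_dir );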

I believe if you try this:

sub foo {
    my $dir = shift;
    chdir $dir;
    local *CURRENT_DIR;
    # Everything else follows
}
you will have better success. Basically, local() saves the calling function's copy of the *CURRENT_DIR glob and restores it when foo() returns, so nothing the recursive call does can clobber the caller's directory handle.
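Fleshed out, the whole sub might look something like this (a sketch only; it assumes foo() is always handed a single path component, so that the chdir '..' at the end backs out correctly):

sub foo {
    my $dir = shift;
    chdir $dir or die "Couldn't chdir to $dir: $!";
    local *CURRENT_DIR;    # caller's glob is restored when foo() returns
    opendir CURRENT_DIR, '.' or die "Couldn't open $dir: $!";
    while ( defined( my $entry = readdir CURRENT_DIR ) ) {
        next if $entry eq '.' or $entry eq '..';
        if ( -d $entry ) {
            foo($entry);   # recurse; local() keeps our handle safe
        } else {
            # do something interesting with the file
        }
    }
    closedir CURRENT_DIR;
    chdir '..';            # back out to where the caller left us
}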

On a broader issue, let me give you a few hints. If your directory structure is very deep, you will quickly run out of file handles, because each level of the recursion holds its directory handle open until it finishes. Those of us with (way too much) experience writing this kind of code usually suck the contents into an array and then close the directory handle:

opendir FOO, "$dir" or die "Couldn't open $dir: $!"; @contents = readdir FOO; closedir FOO;
This also sidesteps your original problem, since the handle is closed before you recurse. A common variant is to filter out the . and .. entries as you read:
@contents = grep !/^\.\.?$/, readdir FOO;
but I do not much care for it. If you are processing the contents of the directory later anyway, you have just made perl walk the list twice when a simple next() inside the loop could have done the job, as in the sketch below.
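Something like this (assuming @contents was filled as above):

for ( @contents ) {
    next if /^\.\.?$/;    # skip . and .. in the same pass that does the real work
    # process the entry here
}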

If the directory is too big to suck in at once and you are still running out of file handles, you can read as you go but hold off on the recursion until the handle is closed:

sub recurse_me {
    my $dir = shift;
    my @dirs;    # must be lexical, or every level of the recursion shares one list

    opendir FOO, $dir or die "Couldn't open $dir: $!";
    while ( defined( my $entry = readdir FOO ) ) {
        next if $entry eq '.' or $entry eq '..';
        if ( -d "$dir/$entry" ) {
            push @dirs, "$dir/$entry";
        } else {
            # do something interesting
        }
    }
    closedir FOO;    # closed before we recurse, so the bareword FOO is safe to reuse

    recurse_me( $_ ) for sort @dirs;
}
Probably more than you asked for, but I thought I might help you avoid some later pain.

Hth,
mikfire