in reply to problems returning from recursive subroutine

The DIR handle is GLOBAL. Therefore the recursive call clobbers (overwrites) the parent's dirhandle, so when the call returns, the handle is already positioned at the end of the directory listing and readdir cannot return anything more.

You have to add

    local *DIR;

into the subroutine, above the opendir, or use a lexical dirhandle:

    sub lc_filenames {
        my ($dir) = @_;
        $dir ||= ".";
        opendir my $DIR, $dir;
        while (defined(my $file = readdir $DIR)) {
            ...

This syntax will not work in older Perls! You may need to add use FileHandle; and declare the $DIR above the opendir like this:

    my $DIR;
    opendir $DIR, $dir;

Another nit. You do not close the dirhandle! You should!
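Putting the lexical dirhandle and the explicit closedir together, here is a minimal sketch. The full body of the original lc_filenames was not posted, so the assumption here is that it recursively collects lowercased file names; treat the details as illustrative, not as the original code.

```perl
use strict;
use warnings;
use File::Spec;

sub lc_filenames {
    my ($dir) = @_;
    $dir ||= ".";
    opendir my $DIR, $dir or die "Cannot opendir $dir: $!";
    my @names;
    while (defined(my $file = readdir $DIR)) {
        next if $file eq '.' or $file eq '..';
        my $path = File::Spec->catfile($dir, $file);
        if (-d $path) {
            # The lexical $DIR belongs to this call frame, so the
            # recursive call cannot clobber it the way a global
            # DIR handle would.
            push @names, lc_filenames($path);
        }
        else {
            push @names, lc $file;
        }
    }
    closedir $DIR or warn "Cannot closedir $dir: $!";
    return @names;
}
```

Because each recursion level holds its own handle, the parent's readdir position is intact when the inner call returns.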

Jenda
Always code as if the guy who ends up maintaining your code will be a violent psychopath who knows where you live.
   -- Rick Osborne


Re: problems returning from recursive subroutine
by Abigail-II (Bishop) on Apr 18, 2003 at 10:02 UTC
    You do not close the dirhandle! You should!

    Why? Perl will close the handle automatically when it goes out of scope. Closing it yourself only makes sense if you are interested in the return value, and are going to do something different depending on the value. Would you expect a close() to fail, and if it fails, what action would you take?

    Abigail

      I would offer that the reason to explicitly close everything is to tell your maintainer that you've done so.

      Another, more important, reason is the same reason I always put the trailing comma in a list of stuff - what if I add more stuff?!? Then, the implicit close happens later, and that may not be good.

      Of course, best, in my opinion, is to limit your exposure to connections like handles and $sth's to as small a block as possible, just in case.
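One way to keep that exposure small is to confine the handle to its own bare block, so the implicit close happens exactly where a reader expects it. A sketch (the throwaway temp file is only there to make the example self-contained):

```perl
use strict;
use warnings;
use File::Temp ();

# A throwaway file just for the demonstration.
my $tmp = File::Temp->new;
print {$tmp} "one\ntwo\nthree\n";
$tmp->flush;

my $count = 0;
{
    # The handle exists only inside this bare block; Perl closes it
    # at the closing brace, so its lifetime is obvious to a reader.
    open my $fh, '<', $tmp->filename or die "open: $!";
    $count++ while <$fh>;
}   # $fh goes out of scope and is closed here

print "$count lines\n";
```

Adding more code below the block later cannot accidentally stretch the handle's lifetime, which is exactly the "what if I add more stuff" worry above.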

      ------
      We are the carpenters and bricklayers of the Information Age.

      Don't go borrowing trouble. For programmers, this means Worry only about what you need to implement.

      Please remember that I'm crufty and crochety. All opinions are purely mine and all code is untested, unless otherwise specified.

Should you then, for the same reason, undef your scalars, and empty your arrays and hashes when you are done, so your maintainer (if there is a maintainer...) knows you are done? Besides, if the maintainer doesn't understand that going out of scope means you are done with it, wouldn't you have bigger problems than not having an explicit close?

        Abigail

      In fact close can indeed fail because, for instance, the disk is full and so it cannot flush buffers. In which case you probably want to report that and possibly want to stop processing. And closing a pipe can give you all sorts of error information.
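For the write case, checking close is cheap. A sketch (the temp file stands in for whatever real output file you would use):

```perl
use strict;
use warnings;
use File::Temp ();

my $tmp  = File::Temp->new;     # stand-in for a real output file
my $path = $tmp->filename;

open my $out, '>', $path or die "open $path: $!";
print {$out} "important data\n" or die "print $path: $!";

# close flushes Perl's buffers to the OS; on a full disk that flush
# can fail, and this is the last chance to hear about it.
close $out or die "close $path: $!";
```

Ignoring that return value means silently pretending the data made it to disk.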

      Also according to the documentation for 5.6.1, an explicit close resets $. while an implicit one due to a following open does not. If you are reporting $. and want that to be accurate, then it is better to do an explicit close whether or not you pay attention to its return value.

        You are wrong about your second point. The handle is being closed explicitly as far as that part of the documentation is concerned. It goes out of scope and gets closed by perl; the filehandle visible during the next execution of the block is a different one from that of the previous. It's not the same filehandle being closed implicitly due to a subsequent open.
        $ (echo 1 ; echo 2 ; echo 3) > 1
        $ (echo 1 ; echo 2 ; echo 3 ; echo 4) > 2
        $ perl -le'for(@ARGV) { open my $fh, "<", $_; 1 while <$fh>; print $. }' 1 2
        3
        4
        $ perl -le'for(@ARGV) { open my $fh, "<", $_; 1 while <$fh>; print $. }' 2 1
        4
        3
        Q.E.D.

        Makeshifts last the longest.

        I certainly know that a close on a handle that you have written to can fail because the disk is full. But we are talking here about closedir. Closing a directory handle you have only read from.

        I ask again: for which kind of failure do you want to prepare, and which action, different from what you would normally take, would you take in case of failure?

        Also according to the documentation for 5.6.1, an explicit close resets $. while an implicit one due to a following open does not. If you are reporting $. and want that to be accurate, then it is better to do an explicit close whether or not you pay attention to its return value.

        Goodie. Here's another random quote from the documentation of an old version of Perl. Let's take 5.001k for instance.

        dump LABEL
            This causes an immediate core dump. Primarily this is so
            that you can use the undump program to turn your core dump
            into an executable binary after having initialized all your
            variables at the beginning of the program. When the new
            binary is executed it will begin by executing a goto LABEL
            (with all the restrictions that goto suffers). Think of it
            as a goto with an intervening core dump and reincarnation.
            If LABEL is omitted, restarts the program from the top.
            WARNING: any files opened at the time of the dump will NOT
            be open any more when the program is reincarnated, with
            possible resulting confusion on the part of Perl. See also
            -u option in the perlrun manpage.

        It has nothing to do with closing directories, but then, $. has nothing to do with it either. But you seem to like posting random trivia of old versions of Perl.

        Abigail