Re2: problems returning from recursive subroutine
by dragonchild (Archbishop) on Apr 18, 2003 at 13:06 UTC
I would offer that one reason to explicitly close everything is to tell your maintainer that you've done so. Another, more important, reason is the same reason I always put the trailing comma in a list of stuff - what if I add more stuff?!? Then the implicit close happens later, and that may not be good.
Of course, the best option, in my opinion, is to limit your exposure to connections like filehandles and $sth's to as small a block as possible, just in case.
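A minimal sketch of what that scoping might look like with DBI (the connection string, table, and column here are made up purely for illustration, not taken from the post):

use strict;
use warnings;
use DBI;

# Hypothetical connection and schema, for illustration only.
my $dbh = DBI->connect("dbi:SQLite:dbname=example.db", "", "",
                       { RaiseError => 1, PrintError => 0 });

sub user_count
{
    my ($min_age) = @_;
    my $count;
    {
        # The statement handle only exists inside this bare block,
        # so it can't leak into the rest of the function.
        my $sth = $dbh->prepare("SELECT COUNT(*) FROM users WHERE age >= ?");
        $sth->execute($min_age);
        ($count) = $sth->fetchrow_array;
        $sth->finish;   # explicit, so the reader knows we're done with it
    }
    return $count;
}

print user_count(18), "\n";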
------ We are the carpenters and bricklayers of the Information Age. Don't go borrowing trouble. For programmers, this means Worry only about what you need to implement. Please remember that I'm crufty and crochety. All opinions are purely mine and all code is untested, unless otherwise specified.
Should you then, for the same reason, undef your scalars, and empty your arrays and hashes when you are done, so your maintainer (if there is a maintainer...) knows you are done? Besides, if the maintainer doesn't understand that going out of scope means you are done with it, wouldn't you have bigger problems than the lack of an explicit close?
Abigail
I make a distinction between connections to outside processes (like databases and files) and purely internal things (like variables).
The thing that bit me regarding this (because I always used to work with implicit closes) was prepare_cached(). The following snippet, within a function that's called repeatedly, will not work as expected:
sub oft_used_func
{
    # ...
    if ($do_db_call)
    {
        my $sth = $dbh->prepare_cached($sql) || die;
        $sth->execute(@bind_vars) || die;
        while ($sth->fetch)
        {
            # ...
        }

        # Use implicit close here
        # $sth->finish;
    }
    # ...
}
The second time you do that DB statement, using prepare_cached() and not having called finish(), you will get a DBI error. Thus, I explicitly close all external things, but implicitly close all internal things.
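For what it's worth, a sketch of the same snippet with the explicit finish() restored ($dbh, $sql, @bind_vars and $do_db_call are assumed to be set up elsewhere, just as in the original):

if ($do_db_call)
{
    my $sth = $dbh->prepare_cached($sql) || die $dbh->errstr;
    $sth->execute(@bind_vars) || die $sth->errstr;
    while (my $row = $sth->fetch)
    {
        # ... use @$row ...
    }

    # Explicit finish: the cached handle is no longer Active, so the
    # next call to prepare_cached() can hand it back without complaint.
    $sth->finish;
}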
------ We are the carpenters and bricklayers of the Information Age. Don't go borrowing trouble. For programmers, this means Worry only about what you need to implement. Please remember that I'm crufty and crochety. All opinions are purely mine and all code is untested, unless otherwise specified.
Re: Re: problems returning from recursive subroutine
by Anonymous Monk on Apr 18, 2003 at 14:38 UTC
In fact, close can indeed fail because, for instance, the disk is full and it cannot flush its buffers, in which case you probably want to report that and possibly stop processing. And closing a pipe can give you all sorts of error information.
Also according to the documentation for 5.6.1, an explicit close resets $. while an implicit one due to a following open does not. If you are reporting $. and want that to be accurate, then it is better to do an explicit close whether or not you pay attention to its return value.
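A small sketch of what checking close's return value can look like (the file and command names are just examples):

use strict;
use warnings;

open my $fh, '>', 'report.txt' or die "open report.txt: $!";
print {$fh} "lots of buffered output\n";

# close() flushes the buffers, so a full disk can surface here rather
# than at print() time.
close $fh or die "close report.txt: $!";

# For a pipe, close() also reports on the child: $! for a real error,
# $? for a non-zero exit status.
open my $pipe, '-|', 'ls /tmp' or die "open pipe: $!";
my @entries = <$pipe>;
close $pipe
    or die $! ? "error closing pipe: $!"
              : "child exited with status " . ($? >> 8);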
I certainly know that a close on a handle that you have written to can fail because the disk is full. But we are talking here about closedir: closing a directory handle you have only read from. I ask again, for which kind of failure do you want to prepare, and which action, different from what you would normally take, do you want to take in case of failure?
Also according to the documentation for 5.6.1, an explicit close resets $. while an implicit one due to a following open does not. If you are reporting $. and want that to be accurate, then it is better to do an explicit close whether or not you pay attention to its return value.
Goodie. Here's another random quote from the documentation of an old version of Perl. Let's take 5.001k for instance.
dump LABEL
This causes an immediate core dump. Primarily
this is so that you can use the undump program to
turn your core dump into an executable binary
after having initialized all your variables at the
beginning of the program. When the new binary is
executed it will begin by executing a goto LABEL
(with all the restrictions that goto suffers).
Think of it as a goto with an intervening core
dump and reincarnation. If LABEL is omitted,
restarts the program from the top. WARNING: any
files opened at the time of the dump will NOT be
open any more when the program is reincarnated,
with possible resulting confusion on the part of
Perl. See also -u option in the perlrun manpage.
It has nothing to do with closing directories, but then, $. has nothing to do with it either. But you seem to like posting random trivia from old versions of Perl.
Abigail
You are wrong about your second point. The handle is being closed explicitly as far as that part of the documentation is concerned: it goes out of scope and gets closed by perl, and the filehandle visible during the next execution of the block is a different one from the previous iteration's. It's not the same filehandle being closed implicitly by a subsequent open.
$ (echo 1 ; echo 2 ; echo 3) > 1
$ (echo 1 ; echo 2 ; echo 3 ; echo 4) > 2
$ perl -le'for(@ARGV) { open my $fh, "<", $_; 1 while <$fh>; print $. }' 1 2
3
4
$ perl -le'for(@ARGV) { open my $fh, "<", $_; 1 while <$fh>; print $. }' 2 1
4
3
Q.E.D.
Makeshifts last the longest.