Re: Re: To Kill a Meme: while(defined($line = <>))
by sauoq (Abbot) on Nov 03, 2003 at 10:20 UTC
Your post gives the impression that it's somehow wrong to be explicit,
That's certainly not what I hoped to convey.
As I said in a reply to Juerd, I relish the fact that I don't have to be explicit with everything, but that's a very different sentiment and I was addressing a very different point.
What I don't like about the whole while-defined-readline business is that most people use it without a clue why they are using it.
Maybe I should have chosen a less violent title. And perhaps, instead of ending the node with "please, try to lay this old habit to rest" I should have said, "if you don't know why you are doing it, don't bother." I really don't care half as much as some seem to think I do now. :-)
But just because it's no longer needed doesn't make it wrong; it doesn't easily lead to mistakes (quite the opposite), nor is it misleading.
I mostly agree with that, except that I think it does lead to mistakes. Not coding mistakes, but conceptual mistakes. When it is coded explicitly, it seems to indicate to Perl beginners that it is necessary for a common case. Then they start imagining what that common case is; they get it all wrong, but they don't know it. Then those assumptions bleed into other code they write.
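For what it's worth, the behaviour in question is easy to demonstrate. A minimal sketch (using an in-memory filehandle, and assuming Perl 5.8 or later for the scalar-ref open; the data is made up):

```perl
use strict;
use warnings;

# Input whose last "line" is the string "0" with no trailing newline --
# the only case where a defined line evaluates as false.
my $data = "first\n0";

open my $fh, '<', \$data or die "open: $!";

my @lines;
# Since Perl 5.005 this condition is implicitly rewritten as
# defined(my $line = <$fh>), so the false-but-defined final "0"
# is still read.
while (my $line = <$fh>) {
    push @lines, $line;
}
close $fh;

print scalar(@lines), "\n";   # prints "2"
```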
-sauoq
"My two cents aren't worth a dime.";
I mostly agree with that except that I think it does lead to mistakes. Not coding mistakes, but conceptual mistakes. When it is coded explicitly, it seems to indicate that it is necessary for a common case.
Well, generally, the defined test is necessary. The "common" case is the exception where Perl is providing the short-cut. If there is any danger, it's that people get used to writing while ($line = <>), and think that they can also write while ($cond and $line = <>) or $line = <>; while ($line) { ...; $line = <> }.
Now, I don't think the danger is anything to worry about, but I've seen cases where the defined() test was missing where it should have been.
Now, I'd be really interested in hearing what you think is the common case, and where people get it wrong by using defined($line = <>).
Abigail
Now, I'd be really interested in hearing what you think is the common case, and where people get it wrong by using defined ($line = <>).
The most common case is reading a text file which ends with a newline. And doing so without changing $/.
What people get wrong is that they tend to think they need to check defined() because of blank lines. In other words, they get the mistaken impression that a scalar can contain a newline and be false.
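That impression is easy to test: a blank line as returned by the angle operator still carries its newline, so it is always a true string. A quick sketch:

```perl
use strict;
use warnings;

# A "blank" line as returned by readline still contains its newline,
# so it can never be false.
my $blank_line = "\n";
my $blank_is_true = $blank_line ? 1 : 0;

# The one false-but-defined value readline can return is a final "0"
# with no trailing newline.
my $zero_line = "0";
my $zero_is_true = $zero_line ? 1 : 0;

print "blank: $blank_is_true, zero: $zero_is_true\n";   # prints "blank: 1, zero: 0"
```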
-sauoq
"My two cents aren't worth a dime.";
I've seen cases where the defined() test was missing where it should have been.
I'm curious by what criteria you judge that it "should have been." Do you mean that it should be there as a matter of good defensive programming style? Or was there a real likelihood that an error would have resulted eventually? And, if so, what error?
-sauoq
"My two cents aren't worth a dime.";
Don't close filehandles (was: To Kill a Meme: while(defined($line = <>)) )
by Aristotle (Chancellor) on Nov 03, 2003 at 19:53 UTC
A plea to not close a filehandle because Perl will close it for you?
Actually, this is one thing I always advise people to do - use lexical filehandles in tight scopes, and let their scoping take care of closing. That's much cleaner and easier to maintain than package filehandles.
It's related to the distinction in control flow between GOTO-driven and structured programs: you don't have to work out the temporal sequence of the code to understand where a filehandle comes into play and when its lifecycle ends; all you need to do is look at the spatial layout of the source.
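A minimal sketch of that advice (an in-memory handle stands in for a real file; the data is made up):

```perl
use strict;
use warnings;

my $data = "one\ntwo\n";
my @lines;

# The filehandle's lifecycle is visible from the spatial layout alone:
# it exists only inside this block.
{
    open my $fh, '<', \$data or die "open: $!";
    while (my $line = <$fh>) {
        chomp $line;
        push @lines, $line;
    }
}   # $fh goes out of scope here and Perl closes it implicitly

print "@lines\n";   # prints "one two"
```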
Makeshifts last the longest.
Isn't it at least theoretically possible for close to return an error? I'm not sure what, if anything, could be done to rectify it if it did, but it could be used to alter the course of the rest of the program, even if only to log an error and exit.
Is there any way of trapping and handling such an error if you allow an unclosed lexical filehandle to just go out of scope?
I guess if you were using IO::*, you could override the DESTROY method, but would the filehandle still be open at that point? Or would an error code be accessible?
Examine what is said, not who speaks.
"Efficiency is intelligent laziness." -David Dunham
"Think for yourself!" - Abigail
Isn't it at least theoretically possible for close to return an error?
Yes, but that's normally only important for files open for writing, sockets, and pipes. Usually you don't really care if you can't close a filehandle that you have open for reading.
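The distinction might be sketched like this (File::Temp is a core module; the file name is whatever tempfile() hands back):

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

my ($tmp_fh, $tmp_name) = tempfile(UNLINK => 1);
close $tmp_fh;

# Writing: buffered data may only hit the disk at close time, so a
# full disk or an NFS error can first surface here -- check the
# return value.
open my $out, '>', $tmp_name or die "open $tmp_name: $!";
print {$out} "some data\n"   or die "print: $!";
close $out                   or die "close $tmp_name: $!";

# Reading: a failed close is rarely interesting; the return value is
# usually ignored, or the close left to scoping entirely.
my $line = do {
    open my $in, '<', $tmp_name or die "open $tmp_name: $!";
    scalar <$in>;
};   # $in closed implicitly at the end of this scope

print $line;   # prints "some data"
```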
Abigail
I think this (letting Perl close the file automatically) is a bad idea.
It does happen on NFS that close returns an error (when the network connection breaks or a quota is exceeded). (It can also happen if the disk is full or physically bad.) But Perl only gives a severe warning in this case, not an error, so if you don't see the warning (because it's a CGI script, or it scrolled off the screen among make's messages), it will seem that the program ran normally.
Again, getting the return value of close for a handle that's open for writing is useful. But for something that's open for reading?
I am going to disagree with anyone who states that filehandles should be closed automatically, or who states that automatic closing is a bad idea because of the return value of close, if they don't distinguish between having something open for writing and having something open for reading.
Abigail
It's not something I thought of because nearly all of my scripts only read other files and write results to STDOUT.
But it doesn't invalidate the approach, as even if I have to check the return value of close, I'll just put the check at the bottom of the scope. A pity that there's no more declarative way to do the check, but anyway.
If anything, it can still be argued to be better than not using a scope in two ways: a) it reinforces a difference compared to read-only accesses (which won't have a close anywhere) and b) you cannot forget to close the filehandle even if you forget to check whether it was closed properly.
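A sketch of that arrangement (the path and data are made up; File::Temp supplies a scratch file):

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

my ($fh0, $path) = tempfile(UNLINK => 1);   # illustrative output path
close $fh0;

my @results = ('alpha', 'beta');

# Write-only scope: the lexical handle cannot leak out, and the
# explicit close at the bottom of the scope is the error check.
{
    open my $out, '>', $path or die "open $path: $!";
    print {$out} "$_\n" for @results;
    close $out or die "close $path: $!";   # the check, at scope's end
}

# Read it back (read handle; close left implicit, per the thread).
open my $in, '<', $path or die "open $path: $!";
my @read = <$in>;
chomp @read;
print "@read\n";   # prints "alpha beta"
```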
Makeshifts last the longest.