PerlMonks
To warn or not to warn, that is the question
by Ovid (Cardinal) on Dec 03, 2000 at 11:03 UTC [id://44639]
Ovid has asked for the wisdom of the Perl Monks concerning the following question:
I'm almost thinking this should be in Meditations instead of Seekers of Perl Wisdom, but since I do have a couple of questions, Seekers it is.
I've been thinking recently about using warnings in Perl. Turning on warnings during development is a great thing and should be encouraged. However, sometimes warnings complain about things that are not really a problem, and you are forced to add code just to silence them.

What if some_sub() returns undef and its result is used in a numeric conditional? With warnings turned on, Perl will complain that you've tried to use an uninitialized value. So what? I know that an uninitialized value evaluates as zero in numeric context, so the code should run fine (assuming that an undef value is not indicative of an issue I should be addressing elsewhere in my code).

If I want to suppress the warning, I can (in 5.6) use lexically scoped warnings, but that feels like overkill. Prior to 5.6, I could use dynamically scoped suppression with local $^W. Again, this seems like overkill if I am only quieting a single conditional. The simplest, and most common, way of disabling this warning is to add an explicit defined test to the conditional.

Well, that's just great. Now I've added overhead to my program to test for something that's only a problem because I've turned warnings on. Ugh! Why should I care whether $x is defined if it's just going to evaluate as zero? If the conditional were $x < 2, then I might want to distinguish an undefined value from zero, but that's not the case here.

Going to such lengths to worry about a micro-optimization may seem a bit ridiculous, but to quote gnat in http://prometheus.frii.com/~gnat/yapc/1999-lies/slide3.html:

Speed does matter, just ask your girlfriend. The difference between 2 seconds and 3 seconds is, duh, 1 second. Do that 100,000 times, though, and you've got over a day being wasted. Not every speed problem is the fault of a poor algorithm.

Since I wanted to address this issue, I decided to benchmark the plain conditional against the defined-guarded one. The result surprised me: the extra test for $_ did not appear to affect the execution speed.
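The original benchmark code was lost in this copy of the post. What follows is a minimal sketch of the kind of comparison described, not the author's actual script: the data list and iteration counts are assumptions, and the core Benchmark module is used for timing.

```perl
use strict;
use Benchmark qw(timethese);

# Hypothetical reconstruction of the benchmark described above: a bare
# numeric test versus one guarded by defined(), run over data that
# contains undef values (as if some_sub() sometimes returned nothing).
my @data = ( 1, 2, undef, 3, undef, 4 );

timethese( 200_000, {
    bare => sub {
        local $^W = 0;    # dynamically suppress "uninitialized" warnings
        my $count = 0;
        $count++ for grep { $_ > 2 } @data;
    },
    guarded => sub {
        my $count = 0;
        $count++ for grep { defined $_ and $_ > 2 } @data;
    },
} );
```

Both subs count the same elements, since undef evaluates as zero and therefore fails the > 2 test either way; the only question is what the extra defined() call costs.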
I have run this several times (and for up to 30 CPU seconds per test) and I get this result consistently. Am I missing something here? Shouldn't testing whether a variable is defined have some impact on execution speed, however minimal, or have I just created a poor benchmark? I would also like to hear other monks' opinions on irrelevant tests for "definedness". I know that warnings about the use of undefined values are usually a good thing, but I don't want to test for something I don't need to. Additionally, such errors can fill up my error logs and make finding the important errors more difficult (though I confess that those "undefined" warnings are often highly relevant).
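For completeness, the 5.6 lexical suppression mentioned above can be scoped very tightly. This is an illustrative sketch (the variable and value are assumptions, standing in for a some_sub() result), showing that only the 'uninitialized' category is disabled, and only inside the block:

```perl
use strict;
use warnings;

# Perl 5.6+ lexically scoped warning suppression: code outside the
# block keeps full warnings.
my $x;            # undef, as if some_sub() had returned nothing useful
my $result;
{
    no warnings 'uninitialized';
    # undef evaluates as 0 in numeric context, so this is quietly false
    $result = ( $x > 2 ) ? 'big' : 'small';
}
print "$result\n";    # prints "small"
```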
Cheers,
Join the Perlmonks Setiathome Group or just click on the link and check out our stats.