IMO, it depends. I usually (almost always) turn warnings on, but there might be reasons to turn them off. The question you have to ask yourself is: where are the warnings going, and what action is taken by the people who see them?

Take for instance a CGI program. If things are set up so that whatever it writes to STDERR is sent to the browser (not that I recommend setting up your server that way, but you aren't always in control of that), generating warnings may do more damage than good. If the warnings are going to be buried in a log file that gets a few million lines a day, it doesn't really matter whether you have warnings on or not, although with warnings on you still have something to dig for. If you have a program that's run from cron, and cron dutifully mails STDERR to a knowledgeable person, by all means keep warnings on. However, suppose you have a program that runs some support task and generates an Excel file. It's run by people who have no idea what "Use of uninitialized value in print" means, and no idea who is in charge of the program. There's no added value in having warnings turned on.
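For the CGI case, one way to keep warnings on without letting them reach the browser is to install a __WARN__ handler that routes them to an application log instead. A minimal sketch, with the log path made up for illustration:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Route warnings to a log file instead of STDERR, so a CGI setup that
    # sends STDERR to the browser never shows them to the user.
    # The log path is an assumption for this sketch.
    open my $warn_log, '>>', '/var/log/myapp/cgi_warnings.log'
        or die "Cannot open warning log: $!";

    $SIG{__WARN__} = sub {
        my ($message) = @_;
        print {$warn_log} scalar(localtime), " $0: $message";
    };

    print "Content-type: text/plain\n\n";
    my $name;                  # deliberately left undefined
    print "Hello, $name\n";    # the warning goes to the log, not the browser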
So, my recommendation is "keep warnings turned on, unless it does more harm than good".
Due to the way our system is architected, it likes to crap out if a script generates a warning, but it will do so silently. There's no way for us to know whether a script died because of a warning or successfully sent its output to the next system.
I suggest you fix this. Since some warnings can't be turned off and the program can write to STDERR for other reasons as well, you'd benefit from fixing your architecture, regardless of the "warnings on/off" question.
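One way to do that, sketched under the assumption that the script is invoked by some wrapper: keep STDERR out of the data stream and check the exit status, so a warning can never silently pass for success. The script and log names here are hypothetical.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical wrapper: redirect the report script's STDERR to a log,
    # keep its output separate, and refuse to pass the result on unless
    # the script exited cleanly.
    my $log = '/var/log/report_wrapper.log';    # assumed location

    system("perl generate_report.pl > report.xls 2>> $log");

    if ($? == -1) {
        die "Could not run generate_report.pl: $!\n";
    }
    elsif ($? != 0) {
        die sprintf "generate_report.pl exited with status %d; see %s\n",
            $? >> 8, $log;
    }

    # Only now hand report.xls to the next system.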
Coworker #2 adds that enabling warnings comes at a performance cost. Is this even true?
Of course it's true. *Everything* comes at a performance cost. Using a loop instead of repeating lines of code comes at a performance cost. Factoring out code and putting it into subroutines or modules comes at a performance cost. Using objects comes at a performance cost. Using Perl instead of C comes at a huge performance cost. Using general-purpose Apache instead of a webserver specially designed and tuned to handle your application comes at a performance cost. Using warnings comes at a performance cost. It's minuscule, and it's typically only used as a strawman argument (which isn't uncommon in the Perl world; non-measurable performance costs are cited as an argument by newbies and veterans alike).
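If anyone wants numbers rather than assertions, a quick Benchmark comparison of the same code compiled with and without the pragma usually shows the difference disappearing into the noise. A rough sketch:

    #!/usr/bin/perl
    use strict;
    use Benchmark qw(cmpthese);

    # The same busy-work, compiled once under 'use warnings' and once
    # under 'no warnings'. This is only a micro-benchmark sketch.
    my ($with, $without);
    {
        use warnings;
        $with = sub {
            my $total = 0;
            $total += $_ * 2 for 1 .. 1000;
            return $total;
        };
    }
    {
        no warnings;
        $without = sub {
            my $total = 0;
            $total += $_ * 2 for 1 .. 1000;
            return $total;
        };
    }

    # Run each for about 2 CPU seconds and compare rates.
    cmpthese(-2, {
        warnings_on  => $with,
        warnings_off => $without,
    });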
I counter via absurdity: why don't we develop without use strict, then?
Not a valid counterpoint, actually. Most of the cost of 'strict' is paid at compile time, while most of the cost of 'warnings' is paid at run time.
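A small illustration of that split: strict violations are reported by `perl -c`, before the program runs at all, while most warnings only fire when the offending code path actually executes.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Compile time: a typo like the commented line below is a hard error
    # under strict, reported by `perl -c script.pl` before anything runs.
    my $name = 'example';
    # print $nmae;    # Global symbol "$nmae" requires explicit package name

    # Run time: this warning only appears if this line is actually reached.
    my $value;
    print "value: $value\n";    # "Use of uninitialized value ..."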
There's one point that hasn't been mentioned: Perl isn't always right about its warnings. Sometimes it generates a warning because it thinks the programmer made a mistake (usually a compile-time warning) when the programmer didn't. Instead of cluttering the code with 'no warnings' blocks, one might turn off warnings altogether. But then you would do it from the start, not just in production.
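For completeness, the alternative to a blanket shut-off is to disable just the offending warning category in the smallest possible scope, for example:

    use strict;
    use warnings;

    my @fields = (undef, 'foo', undef);

    {
        # Silence only the category we know is a false alarm here.
        no warnings 'uninitialized';
        print join(',', @fields), "\n";
    }

    # Full warnings are back in effect outside the block.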