I understand that warnings can be useful for debugging if (when) things go wrong, especially for picking up run-time errors. I do think, though, that that small(ish) bonus is soon cancelled out by the chore of digging through the logs, especially for a large, oft-used application (like a website).
As long as you're aware at design-time of why a warning appears, it seems fairly pointless to let that warning keep appearing, and being logged, in production.
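For what it's worth, you don't have to choose one or the other globally: $^W (the run-time equivalent of the -w switch) can be flipped depending on the environment. A minimal sketch, assuming a hypothetical MYAPP_ENV variable that your own setup would define to tell development and production apart:

    #!/usr/bin/perl
    use strict;

    # $^W is the run-time equivalent of -w. Enable it everywhere
    # except production, so the dev server drops its hints while
    # the live logs stay quiet. MYAPP_ENV is hypothetical - use
    # whatever your own setup provides to distinguish the two.
    BEGIN {
        $^W = 1 if ( $ENV{MYAPP_ENV} || '' ) ne 'production';
    }

That way the warnings are there when you're actually debugging, without cluttering the production logs.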
I see warnings as a very useful debugging tool, rather than something to test the correctness of my code directly - after all, that's what use strict and the compiler are for *grins*. At best, to my mind, warnings drop hints as to why things are going wrong, not what is wrong.
-- Foxcub
A friend is someone who can see straight through you, yet still enjoy the view. (Anon)