PerlMonks  

Re^2: What is the best way to dump data structures to logfiles?

by monsieur_champs (Curate)
on Mar 14, 2007 at 10:07 UTC ( [id://604763] )


in reply to Re: What is the best way to dump data structures to logfiles?
in thread What is the best way to dump data structures to logfiles?

Dear talexb,
Maybe "bad thing" is too strong to express my feelings. I always saw Data::Dumper as a debugging tool, something you should keep away from your production code.

Of course, maybe the good comments here from my fellow monks helped me make up my mind about this and see the module in a different way now.

Maybe I can confine it to the dungeons of my logging system and keep it well-fed with data to dump; then I will be happy, and the operations guys (who are mostly non-programmers) will also be able to tell something useful from the dumped data.
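
Something like this small sketch is what I have in mind (the log_dump name is just for illustration):

    use strict;
    use warnings;
    use Data::Dumper;

    # The rest of the application calls log_dump() and never touches
    # Data::Dumper directly -- it stays locked in the logging dungeon.
    sub log_dump {
        my ($label, $ref) = @_;
        local $Data::Dumper::Indent   = 1;  # readable, one level per line
        local $Data::Dumper::Sortkeys = 1;  # stable key order for grepping logs
        local $Data::Dumper::Terse    = 1;  # drop the leading '$VAR1 ='
        print STDERR scalar(localtime), " [$label] ", Dumper($ref);
    }

    log_dump( order => { id => 42, items => [ 'foo', 'bar' ] } );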

Thank you very much for your comments.

Replies are listed 'Best First'.
Re^3: What is the best way to dump data structures to logfiles?
by talexb (Chancellor) on Mar 14, 2007 at 14:26 UTC
      Maybe "bad thing" is too strong to express my feelings. I always saw Data::Dumper as a debugging tool, something you should keep away from your production code.

    My query was meant to be helpful, not accusatory :) -- I was just trying to challenge your thinking and get you to back up or prove your case.

    Perhaps you could do full logging to a file that only you see, and abbreviated logging to a file that the operations guys see. Or use the same file and tell Operations to ignore anything at the INFO or DEBUG level.
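
    As a rough sketch of that level idea -- assuming Log::Log4perl, which is only one of several options -- the summary lines go out at INFO while the full Data::Dumper output only shows up once you raise the level to DEBUG:

        use strict;
        use warnings;
        use Log::Log4perl qw(:easy);
        use Data::Dumper;

        Log::Log4perl->easy_init($INFO);   # operations view: INFO and above only

        my $order = { id => 42, status => 'shipped' };

        INFO  "processed order $order->{id}";     # short line Operations can read
        DEBUG 'full order: ' . Dumper($order);    # suppressed at the INFO level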

    But by all means dump lots of data if it will help you better understand exactly how your system is working. I'd guess that in six months you'll be confident enough to reduce the logging level. Another strategy would be to bump the logging level up under certain unusual conditions -- that way you log lots of data only in the cases you're really interested in. Or do the inverse -- leave the logging level high, but reduce it once you discover you're doing an ordinary transaction.
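
    And a sketch of the bump-the-level-under-unusual-conditions strategy, again assuming Log::Log4perl (the looks_unusual check is hypothetical -- substitute whatever flags an interesting transaction in your system):

        use strict;
        use warnings;
        use Log::Log4perl qw(get_logger :levels);
        use Data::Dumper;

        Log::Log4perl->easy_init($INFO);
        my $log = get_logger();

        # Hypothetical predicate -- whatever marks a transaction as unusual.
        sub looks_unusual { my ($txn) = @_; return $txn->{amount} > 10_000 }

        my $txn = { id => 7, amount => 25_000 };

        $log->level($DEBUG) if looks_unusual($txn);   # open the floodgates for this case
        $log->debug('unusual transaction: ' . Dumper($txn));
        $log->level($INFO);                           # back to normal afterwards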

    Alex / talexb / Toronto

    "Groklaw is the open-source mentality applied to legal research" ~ Linus Torvalds
