PerlMonks  

Re: Defensive Programming and Audit Trails

by hakkr (Chaplain)
on Aug 06, 2002 at 08:53 UTC


in reply to Defensive Programming and Audit Trails

The web server handles access and error logging for most CGI scripts. How much detail actually gets logged often depends on the quality, and existence, of error-capture code in the script. My approach, so as not to have to sift through the shared web-server error logs, is to redirect STDERR to a different file for each script using CGI::Carp (its carpout function redirects the error stream; set_message customises the fatal-error page).
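A minimal sketch of that redirection, using CGI::Carp's carpout (the log path here is a made-up placeholder, one file per script):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI::Carp qw(carpout);

# Hypothetical per-script log path -- keeps this script's noise
# out of the shared web-server error log.
my $logfile = 'myscript.error.log';

open my $log, '>>', $logfile
    or die "Cannot open $logfile: $!";
carpout($log);

warn "something looks off\n";    # now lands in myscript.error.log
```

carpout also timestamps each line and prefixes it with the script name, which helps when several scripts share a host.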

If you're using DBI, the database logs all data with greater efficiency than a hand-rolled solution. Transaction logs can be created by turning on automatic binary- or text-format logging. If you don't want to use the automatic logs, databases also often have bundled programs or commands such as mysqlhotcopy or mysqldump for backup purposes.
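As a sketch, in MySQL the binary (transaction) log is switched on with a server option; the path here is a placeholder for your own setup:

```ini
# my.cnf -- enable the binary log (records every data-changing statement)
[mysqld]
log-bin = /var/log/mysql/mysql-bin
```

A consistent logical backup can then be taken with something like `mysqldump --single-transaction mydb > mydb.sql` (database name made up for illustration).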

Do you think in these instances logging is therefore unnecessary?

Replies are listed 'Best First'.
Re(2): Defensive Programming and Audit Trails
by FoxtrotUniform (Prior) on Aug 06, 2002 at 15:35 UTC

      The web server handles access and error logging for most CGI scripts.

      ...

      If you're using DBI, the database logs all data with greater efficiency than a hand-rolled solution.

      ...

      Do you think in these instances logging is therefore unnecessary?

    Depends on whether what gets logged is what needs to be, I guess. :-) In general, I'd say no. These logs describe what happens at an interface, and misuse of an interface isn't necessarily the same as an assumption being violated. Even when it is, its signature in the logs can be quite unhelpful.

    For example, maybe I'm updating a database through DBI, based on a key built from several bits of data. I pull these different bits out from the input record with regular expression captures, cat them all together (somehow) to build the key, and then do a $sth->execute or similar. Seems reasonable, right?

    If one of my regex matches fails -- maybe the input data are corrupt, or one of the upstream programmers changed the record format slightly, or something -- the best I'm likely to get is a "Use of uninitialized value at foo.pl line 666" warning from perl. I'll build a nonsense key, then try updating on it -- and updating on a key that doesn't exist isn't an error, last I checked, just a null op. Now maybe that warning makes sense to me, but if I'm catting three different strings together, any (or all) of which might be the problem, it's going to take a lot of unnecessary effort to track down.
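A sketch of that failure mode (the record format and patterns are made up for illustration; the id field is deliberately malformed so the last match fails):

```perl
use strict;
use warnings;

# Hypothetical pipe-delimited record: user|date|numeric id,
# with the id field corrupted.
my $record = "alice|2002-08-06|XX";

my ($user) = ($record =~ /^(\w+)\|/);
my ($date) = ($record =~ /\|(\d{4}-\d{2}-\d{2})\|/);
my ($id)   = ($record =~ /\|(\d+)$/);    # no match: $id stays undef

# Warns "Use of uninitialized value ..." -- but which of the three parts?
my $key = "$user-$date-$id";             # "alice-2002-08-06-", a nonsense key

# An UPDATE keyed on $key isn't an error; it just matches zero rows.
```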

    If, on the other hand, I do something like:

    my ($key_part) = ($record =~ /$foo/)
        or warn "***WARNING: input record <$record> doesn't match $foo\n";

    then I know exactly what's wrong, and how it's wrong, just from looking at the logs.
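Applied to a three-part key, the same idea might look like this (field names and patterns are hypothetical):

```perl
use strict;
use warnings;

# Hypothetical patterns for the three parts of the key.
my %pattern = (
    user => qr/^(\w+)\|/,
    date => qr/\|(\d{4}-\d{2}-\d{2})\|/,
    id   => qr/\|(\d+)$/,
);

sub build_key {
    my ($record) = @_;
    my @parts;
    for my $field (qw(user date id)) {
        my ($part) = ($record =~ $pattern{$field});
        unless (defined $part) {
            warn "***WARNING: record <$record> has no valid $field\n";
            return;    # refuse to build a nonsense key
        }
        push @parts, $part;
    }
    return join '-', @parts;
}
```

On bad input this returns undef and logs exactly which field failed, instead of quietly updating on a key that matches nothing.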

    To me, that's a big win.

    --
    F o x t r o t U n i f o r m
    Found a typo in this node? /msg me
