EchoAngel has asked for the wisdom of the Perl Monks concerning the following question:

Hi Monks, I've been working with log files recently. I create two print statements: one goes to STDOUT, and the other pushes the same message onto a global array. My logfile is then written out from the global array, so that a logfile still gets created if my script crashes/dies. Do you guys do this differently? If so, how?
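
Roughly like this, as a minimal sketch (the sub and file names are made up):

    use strict;
    use warnings;

    our @LOG;    # global array that becomes the logfile

    sub logmsg {
        my $msg = shift;
        print "$msg\n";        # first print: STDOUT
        push @LOG, "$msg\n";   # second "print": the global array
    }

    # An END block still runs after die(), though not after an
    # untrapped signal, so the buffered log gets written even
    # when the script crashes.
    END {
        if ( open my $fh, '>', 'script.log' ) {
            print {$fh} @LOG;
            close $fh;
        }
    }

    logmsg('starting run');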

Replies are listed 'Best First'.
Re: creating log file question
by osunderdog (Deacon) on Mar 03, 2005 at 23:50 UTC

    I've got a Post-it on my monitor reminding me to go look at Log-Logger and Logfile-Rotate . They are probably worth looking into.

    In general, log files are invaluable for figuring out what happened and when. We use them all over the place. It's important to have a retention plan for old log files; otherwise you might find out some day that you are flat out of disk space. In some situations we have a watchdog process that goes out and compresses log files over a certain age and eventually deletes them.
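
    Something like this rough sketch, say (the directory and the 7/30-day cutoffs are just examples):

        #!/usr/bin/perl
        # Compress logs older than 7 days, delete them after 30.
        use strict;
        use warnings;
        use File::Find;

        my $logdir = '/var/log/myapp';

        find( sub {
            return unless -f;
            my $age = -M $_;    # age in days
            if ( $age > 30 ) {
                unlink $_ or warn "unlink $File::Find::name: $!";
            }
            elsif ( $age > 7 && !/\.gz$/ ) {
                system( 'gzip', $_ ) == 0
                    or warn "gzip $File::Find::name failed";
            }
        }, $logdir );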


    "Look, Shiny Things!" is not a better business strategy than compatibility and reuse.

Re: creating log file question
by saintmike (Vicar) on Mar 04, 2005 at 00:48 UTC
Re: creating log file question
by graff (Chancellor) on Mar 04, 2005 at 01:51 UTC
    I create two print statements: one goes to STDOUT, and the other pushes the same message onto a global array. My logfile is then written out from the global array, so that a logfile still gets created if my script crashes/dies.

    Are you printing the same thing to both STDOUT and the global array "log file", or does the log file get different information than STDOUT?

    The normal unix way of doing things is to print output data to STDOUT (e.g. stuff that might be used by a downstream process in a pipeline) and to print log and/or error messages to STDERR, which is typically unbuffered by default.

    If the idea is to handle a crash/die situation by making sure that all of stdout gets saved to a "log" file, unix users would just put a "tee [-a] log.file" command in the pipeline right after the perl script.

    If the log data is different from stdout data, and the idea is to handle a crash/die by saving a trace of how far you got, just print log data to STDERR, and redirect stderr to a file on the command line, or open STDERR to a named file within your script.
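
    For instance, a minimal sketch of the in-script variant (the file names are placeholders):

        use strict;
        use warnings;

        # Reopen STDERR onto a named file; everything warn()ed or
        # died after this point lands in the file.
        open STDERR, '>>', 'run.log' or die "can't redirect STDERR: $!";

        warn "checkpoint: finished phase 1\n";       # goes to run.log
        print "real output still goes to STDOUT\n";

        # The command-line equivalents would be:
        #   perl script.pl 2>> run.log            # redirect stderr
        #   perl script.pl | tee -a log.file      # capture stdout too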

    But maybe I'm missing your point, because it wasn't clear from the OP what you're hoping to accomplish.

      I was thinking of adding more data into the log file than goes to STDOUT. I've used tee before, but I noticed that it would slow down the applications/scripts I call, so I'm trying to avoid it now. From your reply, it seems it would be up to the user to use unix commands to capture the log.
Re: creating log file question
by ZlR (Chaplain) on Mar 04, 2005 at 08:54 UTC
    With Log::Dispatch you can write to two (or more) outputs at the same time, say to STDOUT and to your log file.

    You create a dispatcher object and add outputs with log levels to it. Then you call the log method with a message and a log level, and it sends the message to every output whose minimum level the message meets.
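
    A minimal sketch of that pattern (the names, levels and log file are just examples):

        use Log::Dispatch;
        use Log::Dispatch::Screen;
        use Log::Dispatch::File;

        my $log = Log::Dispatch->new;

        $log->add( Log::Dispatch::Screen->new(
            name      => 'screen',
            min_level => 'info',
            stderr    => 0,              # write to STDOUT
        ) );

        $log->add( Log::Dispatch::File->new(
            name      => 'file',
            min_level => 'debug',
            filename  => 'script.log',
            mode      => 'append',
        ) );

        # Meets both outputs' levels, so it goes to both:
        $log->log( level => 'info', message => "run started\n" );

        # Below the screen's min_level, so file only:
        $log->log( level => 'debug', message => "gory details\n" );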

    I used it because it's a pure-Perl module, and fairly simple to use.

    zlr

Re: creating log file question
by mlh2003 (Scribe) on Mar 04, 2005 at 00:22 UTC
    I'd have to agree with osunderdog. Those modules are useful for creating and maintaining log files. I also have a cron script that I run once per day: it scans through the log files for any recent (past 24 hours) important/critical messages such as errors, and emails those to my admin account. It saves me from going through the log files manually - and being forgetful, I might not do it every day if I had to do it by hand :)
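
    Something along these lines (the log path, naming scheme and match pattern are just examples, not my real script):

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Assumes one file per day, /var/log/myapp/myapp.YYYY-MM-DD.log,
        # with ERROR/CRITICAL appearing literally in important lines.
        my ( $d, $m, $y ) = (localtime)[ 3, 4, 5 ];
        my $today   = sprintf '%04d-%02d-%02d', $y + 1900, $m + 1, $d;
        my $logfile = "/var/log/myapp/myapp.$today.log";

        open my $in, '<', $logfile or exit;    # nothing logged today
        my @hits = grep { /ERROR|CRITICAL/ } <$in>;
        close $in;
        exit unless @hits;

        # Hand the matches to the local MTA for delivery.
        open my $mail, '|-', '/usr/sbin/sendmail -t' or die "sendmail: $!";
        print $mail "To: admin\@example.com\n",
            "Subject: log errors for $today\n\n", @hits;
        close $mail;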
    _______
    mlh2003