geekondemand has asked for the wisdom of the Perl Monks concerning the following question:

What is the most reliable way to write error/status logs in applications that fork processes? My particular interest is in logging in the context of HTTP::Proxy and HTTP::Daemon. I am looking for something that will ensure that log writes are not interleaved or lost when there are multiple processes writing to the same logs. I'm primarily interested in writing to text files, as opposed to databases or syslog. I seek your wisdom oh wise monks... may your experience light my way.

Replies are listed 'Best First'.
Re: Proper Logging in Forked Processes
by saintmike (Vicar) on Feb 11, 2005 at 09:09 UTC
    The Log::Log4perl framework provides a Synchronized appender which does exactly that: it ensures that log messages originating from parallel processes get written one by one. So if they all end up in a single file, they get written properly without interleaving. Read this FAQ on how to configure the Log::Log4perl module on CPAN for this.
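    A minimal configuration sketch along those lines: Log::Log4perl::Appender::Synchronized wraps an ordinary file appender and serializes writes to it across processes (the appender names and the filename app.log here are just placeholders, not anything mandated by the module):

        log4perl.logger                    = DEBUG, Syncer

        # The real appender that writes to the shared log file
        log4perl.appender.Logfile          = Log::Log4perl::Appender::File
        log4perl.appender.Logfile.filename = app.log
        log4perl.appender.Logfile.mode     = append
        log4perl.appender.Logfile.layout   = PatternLayout
        log4perl.appender.Logfile.layout.ConversionPattern = %d %P %m%n

        # The Synchronized appender gates access to "Logfile"
        log4perl.appender.Syncer           = Log::Log4perl::Appender::Synchronized
        log4perl.appender.Syncer.appender  = Logfile

    Note that loggers talk to the Syncer appender, which in turn forwards each message to the wrapped Logfile appender under a lock, so messages from sibling processes come out whole. The %P in the pattern logs the writing process's PID, which is handy for verifying this.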
Re: Proper Logging in Forked Processes
by perrin (Chancellor) on Feb 11, 2005 at 14:04 UTC
    If you open a local file for append (so the OS positions every write at end-of-file), and each log message goes out in a single short write (the limit depends on your OS, but is probably something less than 4K), writes will be atomic without any additional locking. Just make sure the filehandle is unbuffered, so Perl doesn't split or batch your messages.
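        A short sketch of that approach, assuming a Unix-ish system and a placeholder log path of /tmp/app.log (the "one print per message" rule is what keeps lines intact; the exact atomicity threshold varies by OS and filesystem):

            use strict;
            use warnings;
            use IO::Handle;    # for autoflush() on a lexical filehandle

            # Open for append: the kernel positions each write at EOF,
            # even when several processes share the file.
            open my $log, '>>', '/tmp/app.log' or die "open: $!";
            $log->autoflush(1);    # one write() per print, no stdio batching

            for my $n (1 .. 3) {
                defined( my $pid = fork ) or die "fork: $!";
                next if $pid;    # parent: keep forking

                # Child: a single short print is a single write(),
                # so the line comes out unbroken in the shared log.
                print {$log} "[$$] worker $n starting\n";
                exit 0;
            }
            wait for 1 .. 3;    # reap the three children

        Each line in the resulting file should be whole and carry its writer's PID; only the ordering of lines across processes is unspecified.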