in reply to Multiple processes, one log file?

I implemented something similar (though not with log4perl) using flock: take an exclusive lock on the logfile before writing, write, then release the lock.
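
A minimal sketch of that pattern, assuming a plain append-mode log file; the path and subroutine name here are placeholders, not anything from log4perl:

    use Fcntl qw(:flock :seek);

    # Open in append mode, lock, write, release.  Closing the handle
    # releases the lock, so each log line is its own critical section.
    sub log_line {
        my ($msg) = @_;
        open my $fh, '>>', '/var/log/myapp.log' or die "open: $!";
        flock $fh, LOCK_EX     or die "flock: $!";
        seek $fh, 0, SEEK_END; # re-seek to EOF in case others appended
        print {$fh} "$msg\n"   or die "print: $!";
        close $fh              or die "close: $!";  # releases the lock
    }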

Before that, my experience had been that if you write less than the system block size or PIPE_BUF (4K on many systems), you can get away without locking. I'm not sure that's guaranteed; I remember reading it once, but I've been looking for the reference for two years without finding it, so I can't say for sure. :)
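
For what it's worth, that unlocked approach amounts to a single small append-mode write per message; whether such writes can interleave is exactly the open question above, so treat this as a sketch of the pattern, not a guarantee (the path is again a placeholder):

    use IO::Handle;

    # One append-mode write per message, no lock: relies on small
    # appends (reportedly under 4K) not being interleaved by the kernel.
    open my $fh, '>>', '/var/log/myapp.log' or die "open: $!";
    $fh->autoflush(1);   # avoid stdio buffering splitting the write
    print {$fh} "short message, well under 4K\n" or die "print: $!";
    close $fh;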

Re^2: Multiple processes, one log file?
by cosimo (Hermit) on May 20, 2005 at 15:19 UTC

    In fact, it is not guaranteed. We reinvented our own log wheel: 95% of our logging activity goes to a single log file, and we find it very useful.

    From the Perl docs, various web examples, this web column of merlyn's (written when I didn't even know what Perl was), and extensive™ testing, I learned that using flock() with the LOCK_EX flag is the only way to get consistent updates/writes from many processes onto a single log file (or page counters, for that matter), though this method is not reliable across all platforms.
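
    As a sketch of the page-counter case (the counter path is an assumption for illustration), the key point is that LOCK_EX must cover the whole read-increment-write cycle:

        use Fcntl qw(:flock :seek);

        # The lock must span the read AND the write, or two processes
        # can read the same value and lose an increment.
        open my $fh, '+<', '/tmp/hits.count' or die "open: $!";
        flock $fh, LOCK_EX or die "flock: $!";
        my $count = <$fh> || 0;        # empty file counts as zero
        chomp $count;
        $count++;
        seek $fh, 0, SEEK_SET;         # rewind before overwriting
        truncate $fh, 0;               # discard the old contents
        print {$fh} "$count\n";
        close $fh or die "close: $!";  # releases the lock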

    Not using flock resulted in mangled log files, with different processes' log buffers being mixed and truncated at random.