bobr has asked for the wisdom of the Perl Monks concerning the following question:

Hello monks,

I am using Log::Log4perl as the logger in my FastCGI-based web application (under Apache on WinXP). There are multiple processes running the same code, so all of them write to the same log file. I wondered whether that is a problem, so I made a simple test:

    use Log::Log4perl qw(:easy);

    Log::Log4perl->init(\ qq{
        log4perl.logger               = DEBUG, A1
        log4perl.appender.A1          = Log::Log4perl::Appender::File
        log4perl.appender.A1.filename = test2.log
        log4perl.appender.A1.syswrite = 1
        log4perl.appender.A1.layout   = SimpleLayout
    });

    my $id = shift;
    for (1..20) {
        DEBUG "tick $_ from $id";
        sleep(1);
    }

The script above (in the file log_test.pl) is run ten times with xargs like this:

perl -E "say for 1..10" | xargs -n 1 -P 10 perl log_test.pl

The log file should contain 20 ticks from each of the 10 processes:

    ...
    DEBUG - tick 1 from 4
    DEBUG - tick 1 from 5
    DEBUG - tick 2 from 9
    DEBUG - tick 2 from 10
    ...

Unfortunately, a few entries are missing: typically 5-10 out of the 200 expected. I tried enabling and disabling the syswrite option, but it ended up with similar results. I also tried Log::Log4perl::Appender::Synchronized, but it seems the IPC::SysV semaphores it uses do not work on Windows.
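For reference, the serialization I am after is an exclusive whole-file lock around each write, which plain flock should provide even on Win32 (Perl emulates it there). A rough, untested sketch of just that pattern, outside of Log4perl:

    use strict;
    use warnings;
    use Fcntl qw(:flock :seek);

    my $id = shift // $$;    # process identifier, for the demo only

    # Take an exclusive lock around each append so that concurrent
    # writers cannot interleave or overwrite each other's entries.
    open my $fh, '>>', 'test2.log' or die "open: $!";
    flock($fh, LOCK_EX)            or die "flock: $!";
    seek($fh, 0, SEEK_END);    # another process may have appended meanwhile
    print {$fh} "DEBUG - tick from $id\n";
    close $fh;                 # closing flushes and releases the lock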

Any idea for better approach/solution?

-- thanks, Roman

Replies are listed 'Best First'.
Re: Multiple processes using Log::Log4perl to write into single log
by bobr (Monk) on Sep 20, 2010 at 13:57 UTC
    I played more with the example above and found a configuration that works properly:
        Log::Log4perl->init(\ qq{
            log4perl.logger                        = DEBUG, A1
            log4perl.appender.A1                   = Log::Dispatch::File::Locked
            log4perl.appender.A1.filename          = test5.log
            log4perl.appender.A1.mode              = append
            log4perl.appender.A1.close_after_write = 1
            log4perl.appender.A1.layout            = SimpleLayout
        });

    Looks like the Log::Dispatch::File::Locked appender does a better job than Log::Log4perl::Appender::File in this case: it takes an exclusive flock on the file, and with close_after_write = 1 each message is flushed and the handle closed again, so concurrent writers cannot clobber each other's appends.
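    To double-check, rerunning the same xargs test against this configuration and counting the entries (assuming grep is available from the same toolset as xargs) should now report the full 200:

        perl -E "say for 1..10" | xargs -n 1 -P 10 perl log_test.pl
        grep -c DEBUG test5.log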

    -- Roman

Re: Multiple processes using Log::Log4perl to write into single log
by sundialsvc4 (Abbot) on Sep 20, 2010 at 17:18 UTC

    Just a thought, but... might there be a way to write the log information to some kind of pipe? (Yes, they have ’em in WinXP too.) This pipe would be opened for writing by many processes, but would be consumed by one logging process that did nothing else but read the entries and write them to the log.

    What this might do is avoid contention for the log file, which could otherwise oblige the multiple worker processes to run, in effect, sequentially.
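    In Log4perl terms, one untested sketch of that idea: point every worker at a TCP socket via the stock Log::Log4perl::Appender::Socket (127.0.0.1:12345 here is an arbitrary choice) and run a single consumer that is the only process ever touching the file:

        # Consumer: the sole writer of the log file (untested sketch).
        use strict;
        use warnings;
        use IO::Socket::INET;

        my $server = IO::Socket::INET->new(
            LocalPort => 12345,
            Listen    => 5,
            ReuseAddr => 1,
        ) or die "listen: $!";

        open my $log, '>>', 'test.log' or die "open: $!";
        select((select($log), $| = 1)[0]);    # autoflush the log handle

        # Sequential accept keeps the sketch short; a real consumer
        # would multiplex clients with IO::Select.
        while (my $client = $server->accept) {
            while (my $line = <$client>) {
                print {$log} $line;    # single writer, so no interleaving
            }
        }

    The workers would then only need their appender swapped, e.g. log4perl.appender.A1 = Log::Log4perl::Appender::Socket with A1.PeerAddr = 127.0.0.1 and A1.PeerPort = 12345.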

      I was thinking of replacing the current file-based approach with something else. There are several other options, but I have not dug too much into any of them:
      • A DBI appender can write into a database. That might also have other advantages, like easy remote access to the log (see the sketch after this list)
      • Some kind of message queue. I found RabbitMQ, but I have no experience with it
      • Have one log for each process and merge them before analysis
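      For the DBI option, the configuration might look roughly like this (untested sketch: assumes a pre-created "log" table with pid/level/message columns, SQLite picked arbitrarily; uses the stock Log::Log4perl::Appender::DBI, where the message fills the last unbound placeholder):

          Log::Log4perl->init(\ qq{
              log4perl.logger                   = DEBUG, DB
              log4perl.appender.DB              = Log::Log4perl::Appender::DBI
              log4perl.appender.DB.datasource   = DBI:SQLite:dbname=app_log.db
              log4perl.appender.DB.sql          = INSERT INTO log (pid, level, message) VALUES (?, ?, ?)
              log4perl.appender.DB.params.1     = %P
              log4perl.appender.DB.params.2     = %p
              log4perl.appender.DB.layout       = Log::Log4perl::Layout::NoopLayout
              log4perl.appender.DB.warp_message = 0
          });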

      Any other idea?

      -- thanks, Roman