Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I have a set of Perl scripts, running in separate processes, that all need to write to the same log file.

Essentially, they open the log file in append mode, write one line to it and then close it.

If I do need to lock it, how do I do this?

Thanks,

x

Replies are listed 'Best First'.
RE: Do I need to lock log files?
by chromatic (Archbishop) on Mar 20, 2000 at 23:16 UTC
Re: Do I need to lock log files?
by Wafath (Initiate) on Mar 21, 2000 at 03:10 UTC
    I don't think you do for your case. Try the following code, running multiple copies at the same time:
    #!/usr/local/bin/perl -w
    my $file = 'd:\log_test';
    my $version = $ARGV[0] ? $ARGV[0] : '0';
    my $count = 0;
    while ($count++ < 1000) {
        open OUT, ">>$file" or die "Could not open file $count\n$!\n";
        print OUT time, " $version lots of useless text $count\n";
        close OUT;
    }
    On my NT machine (P233) 1000 iterations is enough, and takes a few seconds to run. On a Solaris machine, 10,000 is needed to give me a few seconds to start multiple copies.

    One interesting result of this program is that the timestamps are not necessarily in order. But since all you are doing is writing to the end of the file, the OS seems to take care of it all.
    W
Re: Do I need to lock log files?
by turnstep (Parson) on Mar 24, 2000 at 23:27 UTC
    For the Anonymous Monk answer below: an "old" version of Perl means anything earlier than 5.004. What a great excuse to upgrade! :) Also, in that code, you should check the return values of flock and seek (see the sketch below). Finally, unlocking is not necessary - the close will release the lock quite nicely.
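    For example, here is a minimal sketch of the same recipe with the return values checked, using the symbolic constants from Fcntl (assumes a Perl recent enough to export them; the filename and message are made up for illustration):

    #!/usr/local/bin/perl -w
    use Fcntl qw(:flock :seek);    # LOCK_EX, SEEK_END, etc.

    my $file = 'logfile';          # hypothetical filename
    open(LOG, ">>$file")   or die "Could not open $file: $!\n";
    flock(LOG, LOCK_EX)    or die "Could not lock $file: $!\n";
    seek(LOG, 0, SEEK_END) or die "Could not seek on $file: $!\n";
    print LOG "Log data\n";
    close(LOG)             or die "Could not close $file: $!\n"; # releases the lock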

    Here's an interesting quote from the fopen manpage:

    If two separate processes open the same file for append, each process may write freely to the file without fear of destroying output being written by the other. The output from the two processes will be intermixed in the file in the order in which it is written.
    It depends on your OS, of course, but most should be safe and do not necessarily need file locking (especially if each process is only writing a single line to the file!).
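    If you want to rely on that append-mode behaviour explicitly, here is a minimal sketch using sysopen with O_APPEND (the filename is made up; sending each line in a single syswrite is what keeps it in one piece):

    #!/usr/local/bin/perl -w
    use Fcntl qw(O_WRONLY O_APPEND O_CREAT);

    my $file = 'logfile';    # hypothetical filename
    sysopen(LOG, $file, O_WRONLY | O_APPEND | O_CREAT, 0644)
        or die "Could not open $file: $!\n";
    # One syswrite per line: the whole line reaches the kernel in a
    # single write() call, so short lines are not interleaved.
    syswrite(LOG, "$$ wrote a line at " . time . "\n")
        or die "Could not write to $file: $!\n";
    close(LOG);

    Note that this behaviour does not generally extend to NFS-mounted files; there you would still want flock or a lock file.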
Re: Do I need to lock log files?
by Anonymous Monk on Mar 21, 2000 at 18:53 UTC
    Any time you have multiple processes contending for the same resource, you should implement file locking. open (FILE,">>logfile) or die'; #open the file flock(FILE,2); # lock it up seek(FILE,0,2); # in case someone wrote after the open #print, etc here print FILE "Log data\n"; # old versions of perl might need an explicit flush here flock (FILE,8); #unlock it close(FILE); #close it down