If logging to a *file* is not a requirement but merely an option, you might just as well log to a database. With a decent database system, many of the problems customarily associated with handling logs simply disappear. Yet another option is to use Sys::Syslog or Unix::Syslog.
Christian Lemburg
Brainbench MVP for Perl
http://www.brainbench.com
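For the syslog route mentioned above, a minimal sketch with Sys::Syslog might look like this (the ident string 'myapp' and the LOG_USER facility are placeholders, not anything from the original post):

```perl
use Sys::Syslog qw(:standard :macros);

# 'myapp' is a placeholder program identifier; 'pid' adds the process
# id to each message; LOG_USER is the generic user-level facility.
openlog('myapp', 'pid', LOG_USER);
syslog(LOG_INFO, 'worker %d wrote a log entry', $$);
closelog();
```

The syslog daemon then handles interleaving entries from multiple processes, which is exactly the problem the original question raises.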
Unless you want to mess about with semaphores, I suggest you stick with the old lock system.
I'm not sure I understood, but if I did, what you actually want is to unlock:
use Fcntl ':flock';
# ...
flock(FH, LOCK_UN);
# or, equivalently, with the numeric constant:
flock(FH, 8);
While a file's lock can be released and the file handle held open between writes, this should not be done unless you also remember to resynchronise the file handle when you subsequently reacquire the lock. E.g.
flock(FH, LOCK_EX);  # acquire exclusive lock
seek(FH, 0, 1);      # seek to the current position to resynchronise the buffer
Failure to resynchronise the file handle can cause many problems when multiple processes are accessing and updating the file simultaneously. This aspect of file locking was discussed in detail in this thread.
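Putting the lock-then-resynchronise idiom together, a full write cycle might look like the sketch below ('app.log' is a placeholder filename; appending with SEEK_END, since log writes go at the end of the file):

```perl
use Fcntl qw(:flock :seek);

open(my $fh, '>>', 'app.log') or die "open: $!";
flock($fh, LOCK_EX) or die "flock: $!";

# Resynchronise: another process may have appended to the file
# between our writes, so move past anything it added.
seek($fh, 0, SEEK_END);

print {$fh} "pid $$ was here\n";

flock($fh, LOCK_UN);   # release the lock for the next writer
close($fh) or die "close: $!";
```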
perl -e 's&&rob@cowsnet.com.au&&&split/[@.]/&&s&.com.&_&&&print'
If the process that is forking them is available to do the work, you could fork them with an open(FH, "-|"), in which case each child's STDOUT goes to that file handle for the main process to collate all together.
Another method is to have each one log to a separate file; the main process then sticks them all together when it finishes. But that's a bit dodgy, I would say.
Otherwise, if your main process does need to do other work, you should probably go for the open+lock+write+close approach.
I'm not sure, but you might not have to open the file every time, just lock it; don't hold me to that, though.
Toby
There are several logging packages on CPAN that should handle this. The syslog route might be the simplest.