in reply to Those fork()ing flock()ers...

Do you write a log line multiple times during the forked process? If so, open the file once at the beginning, lock and unlock it every time you need to log a line, but keep the file open the whole time. Even if you are only logging once, the overhead of forking is probably harder on resources than opening the file for appending. The nice thing about append mode is that every write jumps to the end of the file, even if somebody else has written to it since you opened it.
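In case it helps, here's a rough sketch of that approach (untested; the log path and message are just placeholders):

    use strict;
    use warnings;
    use Fcntl qw(:flock);
    use IO::Handle;

    # Open once, in append mode, before you start logging.
    open(my $log, '>>', '/tmp/myapp.log') or die "Can't open log: $!";
    $log->autoflush(1);    # make each line hit the file while we still hold the lock

    sub log_line {
        my ($msg) = @_;
        flock($log, LOCK_EX) or die "Can't lock log: $!";
        print $log "$msg\n";    # append mode seeks to EOF for every write
        flock($log, LOCK_UN);
    }

    log_line("child $$ starting work");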

Another possible option is to open a pipe in the parent process and fork off a child process that is only responsible for receiving the log. On reflection, though, I suppose you might run into the same problem of lines commingling when several processes print to the pipe at once.
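If you did go the pipe route, it would look roughly like this sketch (again, the path and messages are made up):

    use strict;
    use warnings;

    pipe(my $reader, my $writer) or die "pipe: $!";

    my $pid = fork();
    die "fork: $!" unless defined $pid;

    if ($pid == 0) {
        # Logger child: the only process that ever touches the log file.
        close $writer;
        open(my $log, '>>', '/tmp/myapp.log') or die "Can't open log: $!";
        while (my $line = <$reader>) {
            print $log $line;
        }
        exit 0;
    }

    # Parent (and any workers it forks) just print lines to the pipe.
    close $reader;
    print {$writer} "parent $$ says hello\n";
    close $writer;                        # logger exits once all writers are closed
    waitpid($pid, 0);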

So, to optimize your system, keep forked children alive as long as possible, and open the file you'll be locking as early as possible.

my @a=qw(random brilliant braindead); print $a[rand(@a)];

Replies are listed 'Best First'.
Re: Re: Those fork()ing flock()ers...
by ferrency (Deacon) on Dec 05, 2001 at 19:33 UTC
    Thank you, this is a good idea. I do have one "actual" logfile which may receive more lines than my various "result" files. In this case, since I always know I'll be logging, I can open that one once, and then only open the various "result" files when I have results to output, since I don't always use all of them in every child.

    The more I think about this, the more ridiculously picky I think I'm being :) With a "big calculation, small result" workload, printing out the result is far cheaper than calculating it anyway, so I shouldn't really worry about the extra opens...

    Alan