Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hi,

I have a file which stores a counter; whenever the CGI script is accessed, the counter in that file is incremented. I have used flock() to ensure the file is locked before updating it, but sometimes it gets reset to 0 when the traffic is high. Did I write something wrong in my script? Or is it because of the file locking mechanism? Is there any workaround for this? I am using Redhat Linux 9, Apache 2.0 with perl 5.8.3.

Has anyone of you encountered this problem before? Thanks!

The following is the counter code that I am using:

#!/usr/local/bin/perl
use strict;
use CGI qw(:all);

my $file = 'counter.txt';
my $counter = 0;

{
    local *FH;
    open(FH, "+<$file") || die $!;
    flock(FH, 2) || die $!;
    $counter = <FH>;
    seek(FH, 0, 0);
    truncate(FH, 0);
    print FH $counter+1;
    close(FH);
}

print header();
print "counter[$counter]\n";
Regards,
Perl Beginner

Replies are listed 'Best First'.
Re: Concurrent file access with flock()
by Zaxo (Archbishop) on Jun 17, 2004 at 16:59 UTC

    Your file locking appears to be fine. I can't be sure, because I don't know that LOCK_EX == 2 on your system. It would be good to add use Fcntl qw(:flock); to the top and use your constants by name. I'd also encourage you to use a lexical file handle instead of localizing *FH. A lexical handle is closed when its name goes out of scope and no other references exist.
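
    For example, a sketch of the same block with named constants and a lexical handle (not tested against your setup):

        use Fcntl qw(:flock);    # exports LOCK_SH, LOCK_EX, LOCK_UN by name

        open(my $fh, '+<', $file) or die $!;
        flock($fh, LOCK_EX)       or die $!;
        my $counter = <$fh>;
        seek($fh, 0, 0);
        truncate($fh, 0);
        print {$fh} $counter + 1;
        close($fh) or die $!;     # the lock goes away with the handle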

    Several questions that may point to the problem: Are you sure no other script is accessing the counter? Does your OS support flock in all its glory? Is this the actual code, or is it trimmed for the question?

    That is some commendably clean code.

    After Compline,
    Zaxo

      Hi Zaxo,

      Here's my comment:

      1. I am quite sure that LOCK_EX == 2, so it should be fine.

      2. Correct me if I am wrong, but most of the examples I have seen use localizing *FH for file handling. It also goes out of scope, since I put it inside a block.

      3. Actually, this is not the actual code; I put it here just to simplify things. In reality, there are a few different perl scripts that access the counter file, but they all access it via a common package (with locking, of course). I even have some scripts accessing the counter via mod_perl 1.99.

      4. RedHat does have support for flock(); I have been using it all along and it hasn't given me any problems.

      I just want to know if anyone has experienced this kind of problem before. Anyway, thanks for your comments!

      Regards,
      Perl Beginner.

Re: Concurrent file access with flock()
by tilly (Archbishop) on Jun 18, 2004 at 01:30 UTC
    The error that you have is very often seen as a result of a race condition when your open mode is +> or if locking did not work. It should not happen if your mode is +< and you check for locking (which you do).

    Therefore it is odd that you are seeing the problem.

    But the fact that this code is clearly inline in the top of a script is suspicious. I would strongly suggest that you look through all of your scripts which access counter.txt and see if any of them has omitted locking or has the wrong open mode. I would further suggest that, rather than write the same locking code in each script, you write a function in a library and then call that function from everywhere. This reduces code duplication, and it saves you from having to hunt through lots of places to make sure you have caught every last copy of a bug like this.
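
    A rough sketch of such a shared routine (the Counter package name and bump() function are made up here; adapt them to your own library):

        package Counter;
        use strict;
        use Fcntl qw(:flock);

        sub bump {
            my ($file) = @_;
            open(my $fh, '+<', $file) or die "open $file: $!";   # '+<' does not truncate before the lock is held
            flock($fh, LOCK_EX)       or die "flock $file: $!";
            my $count = <$fh> || 0;
            seek($fh, 0, 0);
            truncate($fh, 0);
            print {$fh} $count + 1;
            close($fh) or die "close $file: $!";                 # releases the lock
            return $count + 1;
        }

        1;

    Every CGI script (and the mod_perl handlers) would then just call Counter::bump('counter.txt') instead of carrying its own open/flock code.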
Re: Concurrent file access with flock()
by ambrus (Abbot) on Jun 18, 2004 at 08:14 UTC

    If the script somehow dies between the truncate and the close command, the file will become empty. Are you by any chance printing something (the value of the counter) between the truncate and the close? If so, the script may die because it gets a SIGPIPE.
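
    If that turns out to be the case, one guard (just a sketch of the ordering, not the poster's actual code) is to finish the whole update, including the close, before the first print to the browser:

        use CGI qw(:all);
        use Fcntl qw(:flock);

        open(my $fh, '+<', 'counter.txt') or die $!;
        flock($fh, LOCK_EX)               or die $!;
        my $counter = <$fh>;
        seek($fh, 0, 0);
        truncate($fh, 0);
        print {$fh} $counter + 1;
        close($fh) or die $!;    # file is whole again, lock released

        # only now talk to the client; a SIGPIPE here can no longer
        # leave counter.txt truncated
        print header();
        print "counter[$counter]\n";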