Thanks for all the replies. I do know about modules for portable locking, and I've read the man pages on flock. So I was really looking more for a script that would show the problem.
I did write a pair of scripts - IIRC they incremented a number in a shared file and logged each value, and then I checked whether there were any duplicates between the two logs or any numbers out of sequence. Everything looked as expected.
So either I wasn't testing it in a way that would break it, or it is no longer an issue.
Perrin, I did read tilly's articles
(here's one)
which state that flock does not work on Linux. Since I can't make it fail, I was looking for a script that would show me the failure.
Let's see if I still have the script... ah, here is my test script:
#!/usr/bin/perl -w
use strict;
use Fcntl qw(:DEFAULT :flock);
use Time::HiRes 'usleep';

# Each process logs every number it writes, tagged with its pid.
open LOG, '>', "$$.log" or die "can't open log: $!";

while ( 1 ) {
    # Open the lock file in append mode so it is created if missing
    # and never truncated while another process holds the lock.
    open LOCK, '>>', 'lock.file' or die "lock file: $!";
    die "$$ failed to get lock" unless flock( LOCK, LOCK_EX );

    # perlfaq example: read the shared counter, bump it, write it back
    sysopen( FH, 'numfile', O_RDWR|O_CREAT )
        or die "can't open numfile: $!";
    my $num = <FH> || 0;
    chomp $num;
    seek( FH, 0, 0 )  or die "can't rewind numfile: $!";
    truncate( FH, 0 ) or die "can't truncate numfile: $!";
    $num++;
    ( print FH $num, "\n" ) or die "can't write numfile: $!";
    close FH or die "can't close numfile: $!";

    print LOG "$num\t$$\n";

    # Closing the handle releases the lock.
    close LOCK;
    usleep( 100 );
    last if $num >= 100000;
}
close LOG or die "can't close log: $!";
Granted, that's not a very demanding test.
I ran four or five processes at the same time, then merged and sorted the logs and made sure there were no duplicates or missing numbers.
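For completeness, something like this would do the same duplicate/gap check (just a sketch, assuming the per-pid *.log files written by the script above, one "number<tab>pid" line each):

#!/usr/bin/perl -w
use strict;

# Sketch of the verification step: read every per-pid log, flag any
# number that shows up twice, then flag any gap in the sequence.
my %seen;
for my $log ( glob '*.log' ) {
    open my $fh, '<', $log or die "can't open $log: $!";
    while (<$fh>) {
        my ($num) = split /\t/;
        warn "duplicate: $num\n" if $seen{$num}++;
    }
    close $fh;
}
die "no log lines found\n" unless %seen;
my ($max) = sort { $b <=> $a } keys %seen;
for my $n ( 1 .. $max ) {
    warn "missing: $n\n" unless $seen{$n};
}

Run it in the same directory once all the test processes have finished; silence means no duplicates and no gaps.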
Thanks,
BTW -- is there any way to get replies on a PerlMonks thread to notify me by mail?