The problem isn't in Perl. Your mutual exclusion is incorrect. (Obviously ;-)
It's OK to allow programs to read the contents of a file under shared locks, but writers must take exclusive locks -- which you're doing. However, if your exclusive lock attempt fails, you need to re-read the file once you finally acquire the lock. If you couldn't take the exclusive lock right away, someone else was writing to the file, and your in-memory copy is no good.
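For the reader side, a minimal sketch of what "shared locks for readers" looks like with Perl's flock. (The file name and the read_counter sub are hypothetical, just for illustration.)

```perl
use strict;
use warnings;
use Fcntl qw(:flock);

# Hypothetical reader: LOCK_SH lets any number of readers overlap,
# but blocks while a writer holds LOCK_EX on the same file.
sub read_counter {
    my ($path) = @_;
    open my $fh, '<', $path or die "open $path: $!";
    flock( $fh, LOCK_SH ) or die "flock: $!";
    my $count = <$fh> // 0;    # safe to read only after the lock
    chomp $count;
    close $fh;                 # releases the shared lock
    return $count;
}
```

Note that flock gives advisory locks: they only coordinate processes that all agree to call flock before touching the file.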
In your snippet above, when Process 2 fails to get an exclusive lock, it can't just wait for the lock and then clobber the file anyway. It needs to start from scratch, because once the lock attempt has failed, it has no idea what the state of the file is.
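The writer side then looks something like this sketch: take the exclusive lock first, and only read/modify/write the file once the lock is held, so a stale in-memory copy never gets written back. (The update_counter sub and file layout are assumptions for illustration.)

```perl
use strict;
use warnings;
use Fcntl qw(:flock);

# Hypothetical read-modify-write under LOCK_EX. The blocking flock
# call may wait; everything read BEFORE it could be stale, so the
# file is read only after the lock is acquired.
sub update_counter {
    my ($path) = @_;
    open my $fh, '+<', $path or die "open $path: $!";
    flock( $fh, LOCK_EX ) or die "flock: $!";    # waits out other lockers
    my $count = <$fh> // 0;    # re-read: start from the file's real state
    chomp $count;
    $count++;
    seek( $fh, 0, 0 )  or die "seek: $!";
    truncate( $fh, 0 ) or die "truncate: $!";
    print {$fh} "$count\n";
    close $fh or die "close: $!";                # releases the lock
    return $count;
}
```

If you must not block, a non-blocking attempt (`flock($fh, LOCK_EX | LOCK_NB)`) returns false immediately, and the right response to that failure is the same: retry later and re-read, never write the old in-memory copy.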
HTH
Update
I forgot my strongest suggestion of all: if you have the time/resources, ditch the files in favor of a relational DBMS with transaction support....
In reply to Re: File locking, lock files, and how it all sucks
by VSarkiss
in thread File locking, lock files, and how it all sucks
by tocie