However, that is of course not what you would want. Assuming you are only interested in locks made by your own Perl programs, you could implement some kind of lock list in a file on disk.
I would probably go about it by writing a locking module that all your programs use, with functions to lock and unlock files.
The lock/unlock functions would take, as usual, the name of the file to lock or unlock. When invoked, lock() would first write some identifier (perhaps the name of the invoking script and a timestamp?) to a registry textfile, probably named after the locked file plus some extension. Be careful to pick a name that cannot collide with a file that already exists. Then the file itself would be locked as usual with flock(). Unlocking works the same way in reverse: the entry is removed from the registry and the lock is released. At any time, some other function can read the entries from this registry to see who is holding locks.
There is one more thing: you should flock() this registry file as well before reading or updating it, otherwise two processes updating it at once will make a mess of it. :) But since these updates are really fast operations, that extra lock should not be a bottleneck.
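To make all this concrete, here is a rough, untested sketch of how such a module might look. The module name (LockList), the .locks extension for the registry file, and the entry format are all just things I made up for illustration, so adapt them to taste:

package LockList;

use strict;
use warnings;
use Fcntl qw(:flock);

# Filehandles we currently hold locks on, keyed by filename,
# so unlock_file() can release them later.
my %handles;

# Name of the registry file for a given lock target. The ".locks"
# extension is arbitrary; make sure it cannot clash with a real file.
sub _registry_name { return $_[0] . '.locks' }

# Append or remove an entry in the registry, holding an exclusive
# flock on the registry itself for the duration of the update.
sub _update_registry {
    my ($file, $entry, $add) = @_;
    my $reg = _registry_name($file);

    # '+>>' opens for reading and appending, creating the file if needed.
    open my $fh, '+>>', $reg or die "Cannot open $reg: $!";
    flock $fh, LOCK_EX or die "Cannot flock $reg: $!";

    if ($add) {
        print {$fh} "$entry\n";
    }
    else {
        # Rewrite the registry without our entry.
        seek $fh, 0, 0;
        my @keep = grep { $_ ne "$entry\n" } <$fh>;
        seek $fh, 0, 0;
        truncate $fh, 0;
        print {$fh} @keep;
    }
    close $fh or die "Cannot close $reg: $!";   # also releases the flock
}

# Lock a file exclusively and record who took the lock.
sub lock_file {
    my ($file) = @_;
    my $entry = "$0 pid $$ at " . time;

    # Record ourselves *before* blocking on flock, so the registry
    # also shows who is waiting for the lock, not just who holds it.
    _update_registry($file, $entry, 1);

    # '>>' creates the file if missing and gives a write handle,
    # which flock is happy with everywhere; we never actually write.
    open my $fh, '>>', $file or die "Cannot open $file: $!";
    flock $fh, LOCK_EX or die "Cannot flock $file: $!";
    $handles{$file} = { fh => $fh, entry => $entry };
    return 1;
}

# Release the lock and remove our registry entry.
sub unlock_file {
    my ($file) = @_;
    my $info = delete $handles{$file} or return;
    close $info->{fh};                          # releases the flock
    _update_registry($file, $info->{entry}, 0);
    return 1;
}

# Read back the current entries, for monitoring.
sub list_locks {
    my ($file) = @_;
    open my $fh, '<', _registry_name($file) or return;
    flock $fh, LOCK_SH or die "Cannot flock registry: $!";
    chomp(my @entries = <$fh>);
    close $fh;
    return @entries;
}

1;

A monitoring script can then call LockList::list_locks($file) at any time to dump the entries. And since lock_file() writes its registry entry before it blocks on flock, the registry shows the waiters as well as the current holder - which is the whole point if you want to watch contention.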
The rest you should be able to read up on with perldoc -f flock and perldoc perlfaq5.
This is not necessarily a very elegant solution, but if you expect to be waiting on locks a lot, you want to monitor that, and you have control over all the (Perl) programs that will do the locking, it should work. I think. Someone bash me over the head if my reasoning is faulty...
The sketch above is only a rough starting point; fleshing it out properly is left as an exercise - I am tired and a bit lazy. :)