debugger has asked for the wisdom of the Perl Monks concerning the following question:

Suppose a script a.pl uses a directory 'xyz' to create certain files, say 'a.log' and 'b.log', and writes to them.
While a.pl is running, if a copy of a.pl is started from some other location that uses the same directory 'xyz' to create the a.log and b.log files, my requirement is that the copy of a.pl should not be able to access dir 'xyz' and should exit.
Please suggest a solution to lock the directory while it is in use, and also a way to check whether it is locked.
The solution should work on Windows as well as Linux.
Note:
1. It is not possible to use two different directories for different instances of the run (this is part of the requirement).

Re: Directory locking
by cdarke (Prior) on Feb 16, 2010 at 13:09 UTC
    Use a lock file, a semaphore, or any other queuing method. What have you tried so far yourself? Or is this homework?
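    A minimal sketch of the lock-file idea using flock; the directory and log names are the ones from the question, and the lock-file name a.pl.lck is just an example:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use Fcntl qw(:flock);

        # Assumed lock file inside the shared directory 'xyz'
        my $lockfile = 'xyz/a.pl.lck';

        open my $lock_fh, '>', $lockfile
            or die "Cannot open $lockfile: $!";

        # Try for a non-blocking exclusive lock; if another instance
        # already holds it, exit instead of touching the directory.
        unless (flock $lock_fh, LOCK_EX | LOCK_NB) {
            print "Directory 'xyz' is in use by another instance; exiting.\n";
            exit 1;
        }

        # ... create/append to xyz/a.log and xyz/b.log here ...

        # The lock is released when the handle is closed or the script exits.
        close $lock_fh;

    Perl's flock works on both Windows and Linux, and the lock is released automatically if the script crashes, so there is no stale lock file to clean up.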
Re: Directory locking
by Illuminatus (Curate) on Feb 16, 2010 at 14:52 UTC
    1. Do you really want to lock the whole directory, or just ensure one copy of a.pl is running at a time?
    2. Will the files always be created/overwritten on execution of a.pl, or can they be appended to?
    3. Will there be other programs/scripts/etc that create/write/append to these log files?
    You could either use flock or use sysopen with O_CREAT|O_EXCL on a file a.pl.lck in that directory. Handling error conditions is left as an exercise to the writer.
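    A minimal sketch of the sysopen approach with O_CREAT|O_EXCL, using the same assumed names as above and only minimal error handling:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use Fcntl qw(O_CREAT O_EXCL O_WRONLY);

        my $lockfile  = 'xyz/a.pl.lck';
        my $have_lock = 0;

        # Remove the lock file on exit, but only if this instance created it.
        END { unlink $lockfile if $have_lock }

        # O_EXCL makes the create fail if the file already exists,
        # so only the first instance gets past this point.
        sysopen my $fh, $lockfile, O_CREAT | O_EXCL | O_WRONLY
            or do {
                print "Directory 'xyz' appears to be locked; exiting.\n";
                exit 1;
            };
        $have_lock = 1;
        print {$fh} $$;    # record the owning PID, for diagnostics
        close $fh;

        # ... create/append to xyz/a.log and xyz/b.log here ...

    One caveat: if the process is killed outright or the machine goes down, the leftover a.pl.lck has to be removed by hand before the next run, which is one reason to prefer the flock variant.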
      I want to lock the whole directory.
      Yes, the files will always be appended to (if present) or created and written to (if not present).
      No, there will be no other scripts that create/write/append to these log files.