in reply to Maximum # of concurrent runs

I think that a simple solution could be to have a process handle file. Do the following:

Create a text file with six lines, each containing the word HANDLE.

Next, when your script executes, it should:

* Lock the handle file.
* Open the handle file.
* Read in the handle file.
* If there are no "HANDLE"s left, exit.
* If there are "HANDLE"s left, pop one off the bottom of the file.
* Write out the file.
* Close it.
* Unlock it.
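
Here's a rough Perl sketch of that acquire step. The file name is just a placeholder, and note that in Perl you open the file first and then flock the filehandle, so "lock, then open" collapses into a single open-and-lock step:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl qw(:flock);

    my $handle_file = '/tmp/handles.txt';    # placeholder path

    # Open read/write, then lock -- retrying once a second if another
    # process holds the lock, as the note further down suggests.
    open my $fh, '+<', $handle_file or die "Can't open $handle_file: $!";
    sleep 1 until flock $fh, LOCK_EX | LOCK_NB;

    # Read in the remaining HANDLE lines.
    my @handles = grep { /\S/ } <$fh>;

    # No HANDLEs left: six processes are already running, so exit.
    unless (@handles) {
        close $fh;                           # closing also releases the lock
        exit 0;
    }

    # Pop one HANDLE off the bottom, write the file back out, and
    # close it (which unlocks it).
    pop @handles;
    seek $fh, 0, 0    or die "Can't seek: $!";
    truncate $fh, 0   or die "Can't truncate: $!";
    print {$fh} @handles;
    close $fh or die "Can't close $handle_file: $!";

(A plain blocking flock, LOCK_EX without LOCK_NB, would let the kernel do the waiting instead of the sleep loop.)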

Then do whatever work you intended to do within the script. Upon completion of the work, do the following:

* Lock and open the HANDLE file.
* Push your HANDLE back into the end of the file.
* Close and unlock the file.
* Exit.
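
A matching sketch for the release steps, using the same placeholder path as above:

    my $handle_file = '/tmp/handles.txt';    # same placeholder file as above

    # ... the real work happens here ...

    # Push our HANDLE back onto the end of the file, then close and unlock.
    open my $fh, '>>', $handle_file or die "Can't open $handle_file: $!";
    sleep 1 until flock $fh, LOCK_EX | LOCK_NB;
    print {$fh} "HANDLE\n";
    close $fh or die "Can't close $handle_file: $!";
    exit 0;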

Oh, and if the script finds the HANDLE file already locked, it should just wait a second and try again.

It's a pretty simple method. If you want to increase or decrease the number of simultaneous processes, you just alter the number of handles in the file. The file could just as easily contain a counter instead of a series of "HANDLE"s: each process decrements the counter, runs, then increments it again, and any process that finds the counter at zero simply exits. Same basic concept.
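
The counter variant is nearly identical; only the read-modify-write in the middle changes. Another rough sketch, again with a placeholder path:

    use strict;
    use warnings;
    use Fcntl qw(:flock);

    my $counter_file = '/tmp/slots.txt';     # placeholder; file holds one number, e.g. 6

    open my $fh, '+<', $counter_file or die "Can't open $counter_file: $!";
    sleep 1 until flock $fh, LOCK_EX | LOCK_NB;

    chomp(my $count = <$fh>);
    if ($count <= 0) {                       # counter at zero: no slots free
        close $fh;
        exit 0;
    }

    # Decrement, write the new value back, and release the lock.
    seek $fh, 0, 0   or die "Can't seek: $!";
    truncate $fh, 0  or die "Can't truncate: $!";
    print {$fh} $count - 1, "\n";
    close $fh;

    # ... do the work, then lock the file again and write the counter
    #     back out incremented by one before exiting.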

Dave

"If I had my life to do over again, I'd be a plumber." -- Albert Einstein

Re: Re: Maximum # of concurrent runs
by merlyn (Sage) on Aug 19, 2003 at 18:51 UTC
    This mechanism assumes that people don't do nasty things like "kill -9" the process (which people shouldn't do anyway, but that cargo-cult solution gets repeated all the time). A process killed that way never pushes its HANDLE back, so that slot is lost until someone repairs the file by hand.

    A better solution would be something that gets reset automatically by the operating system even if the process stops dead in its tracks. I give an "only-one" solution in my "highlander" column, which could be extended to "only six" with a bit of cleverness. In fact, I have that bit of cleverness scheduled for a future column idea. {grin}
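
    One way that might work (a guess at the general shape, not the column's actual code, and the slot-file names are placeholders): each process tries a non-blocking flock on each of six slot files and keeps whichever lock it wins open for its entire lifetime. The kernel drops the lock the instant the process dies, however it dies, so there is nothing to push back or clean up:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use Fcntl qw(:flock);

        my $slot_fh;                         # the winning handle (and its lock) lives here
        for my $n (1 .. 6) {
            open my $fh, '>>', "/tmp/myjob.slot.$n" or die "Can't open slot $n: $!";
            if (flock $fh, LOCK_EX | LOCK_NB) {
                $slot_fh = $fh;              # got slot $n; keep the handle open
                last;
            }
            close $fh;                       # someone else holds this slot; try the next
        }
        exit 0 unless $slot_fh;              # all six slots busy

        # ... do the real work; the lock evaporates when this process exits,
        #     even if it was killed with -9 ...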

    -- Randal L. Schwartz, Perl hacker
    Be sure to read my standard disclaimer if this is a reply.