warthurton has asked for the wisdom of the Perl Monks concerning the following question:
I do not want to do this in qmail, since that would restrict the number of concurrent deliveries for the rest of the site.
What I'm thinking about is some way to check how many copies of the script are running and, if the count is > x, wait until it drops to <= x before continuing to process.
Some possibilities are checking the process list (though that could still let multiple copies start at the same time), or writing a file at the start of processing and deleting it at the end (but what if the removal fails? then perhaps check a timestamp).
People do this with pid files quite often, but I'm not sure of the best way to process them.
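Not any particular monk's code, just a hedged sketch of one common refinement of the pid-file idea: instead of one pid file per process, flock a small pool of "slot" files. The kernel drops a flock automatically when the holding process exits, so a crash never leaves a stale lock behind and no timestamp checking is needed. The limit of 4 and the lock directory path are assumptions; adjust for your site.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);

# Assumed values -- tune these for your environment.
my $MAX_SLOTS = 4;
my $LOCK_DIR  = '/tmp/mailer-locks';

mkdir $LOCK_DIR unless -d $LOCK_DIR;

# Try each slot in turn; flock(LOCK_EX | LOCK_NB) fails immediately
# if another process already holds that slot, so at most $MAX_SLOTS
# copies can get past this loop at once.
my $slot_fh;
my $got_slot = 0;
for my $slot (1 .. $MAX_SLOTS) {
    open $slot_fh, '>', "$LOCK_DIR/slot.$slot" or next;
    if (flock $slot_fh, LOCK_EX | LOCK_NB) {
        $got_slot = $slot;
        last;
    }
    close $slot_fh;
}

die "All $MAX_SLOTS slots busy -- try again later\n" unless $got_slot;

print "Running in slot $got_slot\n";
# ... do the real work here ...
# The lock is released when $slot_fh is closed or the process exits,
# even if it exits by crashing.
```

To make the script wait rather than die when all slots are busy, wrap the loop in a retry with a short `sleep`, or drop `LOCK_NB` on a single dedicated slot to block until it frees.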
Has anyone ever had to restrict the # of concurrently running copies of a program? What have you done? What has worked? What hasn't? Did you set up queues?
Thanks for any ideas.
W
Replies are listed 'Best First'.

Re: Maximum # of concurrent runs
by halley (Prior) on Aug 19, 2003 at 16:19 UTC
  by warthurton (Sexton) on Aug 19, 2003 at 16:22 UTC
  by merlyn (Sage) on Aug 19, 2003 at 17:07 UTC

Re: Maximum # of concurrent runs
by dragonchild (Archbishop) on Aug 19, 2003 at 16:17 UTC

Re: Maximum # of concurrent runs
by MidLifeXis (Monsignor) on Aug 19, 2003 at 17:32 UTC
  by warthurton (Sexton) on Aug 20, 2003 at 21:31 UTC

Re: Maximum # of concurrent runs
by esh (Pilgrim) on Aug 19, 2003 at 16:46 UTC

Re: Maximum # of concurrent runs
by BrowserUk (Patriarch) on Aug 19, 2003 at 17:08 UTC

Re: Maximum # of concurrent runs
by esh (Pilgrim) on Aug 19, 2003 at 21:40 UTC
  by warthurton (Sexton) on Aug 20, 2003 at 21:29 UTC

Re: Maximum # of concurrent runs
by davido (Cardinal) on Aug 19, 2003 at 18:25 UTC
  by merlyn (Sage) on Aug 19, 2003 at 18:51 UTC

Re: Maximum # of concurrent runs
by TomDLux (Vicar) on Aug 19, 2003 at 18:06 UTC

Re: Maximum # of concurrent runs
by johndageek (Hermit) on Aug 19, 2003 at 21:14 UTC