P0w3rK!d has asked for the wisdom of the Perl Monks concerning the following question:
Can anyone elaborate on their experiences with forking() around with Perl in a distributed, cross-platform systems environment?
Say you're given 100 processes that you have to fork(), but the order in which they finish is not necessarily 1..100; it varies from day to day depending on resources, the number of machines available to run the scripts on, and so forth. Keep in mind that some of the processes may be dependent upon each other.
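To make the dependency part concrete, here is the sort of thing I mean; the job names and the %deps layout below are made up for illustration, not my real setup:

```perl
# Hypothetical dependency map: each job lists the jobs that must
# have finished before it is allowed to start.
my %deps = (
    extract_feed => [],
    load_db      => ['extract_feed'],
    build_report => ['load_db'],
    mail_summary => ['build_report', 'load_db'],
);
```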
What is the best approach that my fellow brothers have taken to tackle this problem?
I am not looking for a solution so much as your adventure notes on the pitfalls of this topic.
My current solution uses a kind of pooling mechanism to keep track of what is done and what is not. Managing the processes sequentially (i.e. via system()) would solve my problem, but I am trying to run things in parallel (i.e. something in the spirit of exec(), though not exactly; I am looking for an alternative).
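To show roughly what I mean by a pooling mechanism, here is a stripped-down sketch rather than my actual code: nothing but core fork()/waitpid(), a %done hash, and the made-up %deps map from the example above. run_job() is only a stand-in for the real work; a child could just as well exec() an external script.

```perl
use strict;
use warnings;

# Made-up dependency map (as in the example above) and a stand-in
# worker sub, so the sketch is self-contained and runnable.
my %deps = (
    extract_feed => [],
    load_db      => ['extract_feed'],
    build_report => ['load_db'],
    mail_summary => ['build_report', 'load_db'],
);
sub run_job { my ($job) = @_; print "running $job in pid $$\n"; sleep 1 }

my (%done, %running);   # job => 1 once finished / child pid => job name

while (keys %done < keys %deps) {
    # start every job whose prerequisites have all finished
    for my $job (keys %deps) {
        next if $done{$job} or grep { $_ eq $job } values %running;
        next if grep { !$done{$_} } @{ $deps{$job} };   # still waiting on something

        defined(my $pid = fork) or die "fork failed: $!";
        if ($pid == 0) {
            run_job($job);   # a child could exec() an external script here instead
            exit 0;
        }
        $running{$pid} = $job;   # parent remembers which pid runs which job
    }

    # block until some child exits, then mark its job as finished
    my $pid = waitpid(-1, 0);
    die "nothing left to reap (circular dependency?)\n" if $pid == -1;
    $done{ delete $running{$pid} } = 1;
}
```

The sketch says nothing about spreading jobs across machines or capping how many children run at once.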
-P0w3rK!d
Re: Caught Forking() around again
by BazB (Priest) on May 17, 2002 at 17:54 UTC

Re: Caught Forking() around again
by yodabjorn (Monk) on May 17, 2002 at 21:13 UTC