That shouldn't be much of a problem. It divides up nicely into the classic "producers and consumers" scenario.
Producers push stuff onto the queue; consumers take things off of it.
First you have several producers (namely, the people submitting jobs). They dump job requests into a file, making sure they don't step on each other's toes by locking the file with flock.
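As a minimal sketch of the producer side (the file name and job format here are made up; any one-request-per-line format would do):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);

# Hypothetical queue file: one job request per line.
my $queue_file = 'job_queue.txt';

sub submit_job {
    my ($request) = @_;
    open my $fh, '>>', $queue_file or die "Can't open $queue_file: $!";
    # Exclusive lock so concurrent submitters don't interleave writes.
    flock $fh, LOCK_EX or die "Can't lock $queue_file: $!";
    print {$fh} "$request\n";
    close $fh;    # closing the handle releases the lock
}

submit_job('convert input01.dat');
```

Since each submitter appends under an exclusive lock, any number of people can run this at once without corrupting the queue.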
Then you have the consumers... they consume the job requests. You can either have a single process read the job file and manage a bunch of subprocesses, or have each consumer process fetch its own requests. I'd probably opt for one manager process that takes care of reading the job queue and handing the requests off to
the child processes. This is where Parallel::ForkManager comes in handy.
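Here's a rough sketch of that manager, assuming the same one-job-per-line queue file as above, that each line is a shell command, and an arbitrary cap of 4 children at a time:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);
use Parallel::ForkManager;

my $queue_file = 'job_queue.txt';    # created by the submitter script
my $pm = Parallel::ForkManager->new(4);

# Drain the queue under the same flock the producers use,
# so we never read a half-written request.
open my $fh, '+<', $queue_file or die "Can't open $queue_file: $!";
flock $fh, LOCK_EX or die "Can't lock $queue_file: $!";
my @jobs = <$fh>;
chomp @jobs;
truncate $fh, 0;    # queue is now empty; requests are ours
close $fh;

for my $job (@jobs) {
    $pm->start and next;    # parent keeps looping; child falls through
    system($job);           # child runs the request
    $pm->finish;            # child exits
}
$pm->wait_all_children;
```

Parallel::ForkManager handles the bookkeeping: start() forks a child (blocking if 4 are already running), and wait_all_children() makes sure the manager doesn't exit until every job is done. In a real setup you'd probably loop and sleep instead of draining the queue once.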
So you really have two programs: one to submit a job request to the queue, and another to read the list and manage the children that will run the requests in parallel. Does that make sense?
-Blake