I also forgot to mention that multiple people will be
submitting these jobs from different instances of the same
Perl program. So I think I'll need some text file holding
the queue.
That shouldn't be much of a problem. It divides up nicely into the classic "producers" and "consumers" scenario.
Producers push stuff onto the queue, consumers take things off of it.
First you have several producers (namely people submitting jobs). They dump job requests into a file, making sure they don't step on each other's toes by locking the file with flock.
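Here's a rough sketch of what that producer side could look like, assuming a plain-text queue file (the path and job format are just placeholders):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl qw(:flock);

    # Hypothetical queue location; adjust to taste.
    my $queue_file = '/var/spool/jobs/queue.txt';
    my $job = "@ARGV" or die "Usage: $0 <job command>\n";

    open my $fh, '>>', $queue_file or die "Can't open $queue_file: $!";
    flock $fh, LOCK_EX or die "Can't lock $queue_file: $!";
    print $fh "$job\n";   # one request per line
    close $fh;            # closing the handle releases the lock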
Then you have the consumers... they consume the job requests. You can either have a single process read the
job file and manage a bunch of subprocesses, or have each consumer process deal with fetching its own requests. I'd probably opt for having one manager process that takes care of reading the job queue and handing the requests off to
the child processes. This is where Parallel::ForkManager comes in handy.
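As a sketch of that manager side, assuming the same queue file as above and a made-up limit of five children (illustrative, not a drop-in solution):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl qw(:flock);
    use Parallel::ForkManager;

    my $queue_file = '/var/spool/jobs/queue.txt';
    my $pm = Parallel::ForkManager->new(5);   # at most 5 jobs at once

    # Grab the pending requests and empty the queue, holding the
    # lock so producers can't write while we're reading.
    open my $fh, '+<', $queue_file or die "Can't open $queue_file: $!";
    flock $fh, LOCK_EX or die "Can't lock $queue_file: $!";
    chomp(my @jobs = <$fh>);
    seek $fh, 0, 0;
    truncate $fh, 0;
    close $fh;

    for my $job (@jobs) {
        $pm->start and next;   # parent loops on; child falls through
        system($job);          # child runs the request
        $pm->finish;           # child exits
    }
    $pm->wait_all_children;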
So you really have two programs... one used to submit the job request to the queue, and another to read the list and manage the children that will run the requests in parallel. Does that make sense?
-Blake
Thank you. This should help a lot. Another thing I forgot
to mention: the jobs are submitted on NT, but the process
that runs them lives on a UNIX box. Will there be any
conflicts here? I was just planning on using telnet to
submit the jobs.
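In case it helps, here's roughly what I had in mind for the telnet part, using Net::Telnet (host, login, and the submit script are all made-up names):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Net::Telnet;

    my $t = Net::Telnet->new(Host => 'unixbox.example.com',
                             Timeout => 10);
    $t->login('jobuser', 'secret');

    # Run the submit script on the UNIX side; it does the
    # flock-and-append, so the queue file never leaves UNIX.
    my @output = $t->cmd('/usr/local/bin/submit_job "nightly report"');
    print @output;
    $t->close;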