in reply to Re: Re (tilly) 5: Parallel Downloads using Parallel::ForkManager or whatever works!!!
in thread Parallel Downloads using Parallel::ForkManager or whatever works!!!
An incidental conceptual misunderstanding that I see: you are assuming that DOCUMENT_RETRIEVER will return something useful in the parent. It won't, because the call happens in the forked child, but since you don't use the return value it shouldn't be causing the problems you see (yet). What it does mean is that children and parents will need some way to communicate, and the odds are pretty good that it will be through external files.
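Here is a minimal sketch of that pattern, assuming DOCUMENT_RETRIEVER is your routine from the thread (the URLs and file names are made up for illustration): each child writes what it fetched to its own file, and the parent picks the files up after waiting for the children.

    use strict;
    use warnings;
    use Parallel::ForkManager;

    my @urls = ('http://example.com/a', 'http://example.com/b');  # made-up URLs
    my $pm   = Parallel::ForkManager->new(10);

    for my $i (0 .. $#urls) {
        $pm->start and next;   # parent skips ahead; only the child falls through
        # In the child: a return value here never reaches the parent,
        # so write the result somewhere the parent can find it.
        my $content = DOCUMENT_RETRIEVER($urls[$i]);   # your routine, per the thread
        open my $fh, '>', "result.$i" or die "Can't write result.$i: $!";
        print $fh $content;
        close $fh;
        $pm->finish;           # child exits here
    }
    $pm->wait_all_children;
    # Back in the parent: the results now live on disk as result.0, result.1, ...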
And an incidental note. Most people who like to be called things like "Perl guru" aren't. In general I have found that people who think of themselves as really good do so because they have never been in the larger pond of genuinely good people. Without that experience they have had to invent things for themselves, which means that they may be better than their friends, but they will not look very good next to a random person who has absorbed "standard good advice".
And a final note. Parallel processing like this with many processes works best when you are doing things where the bottleneck is I/O. If you are doing computationally intensive work, then it is preferable to run only as many processes as you have CPUs. Because of this I would suggest that you rethink your design. It is probably going to make sense to have one loop where you download your files in parallel, and then have another loop where you do the complex processing serially.
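A rough sketch of that two-phase design, where the file list and the process_file routine are hypothetical stand-ins for whatever your real work is:

    use strict;
    use warnings;
    use Parallel::ForkManager;
    use LWP::Simple qw(getstore);

    # Made-up list of files to fetch.
    my %files = (
        'page1.html' => 'http://example.com/1',
        'page2.html' => 'http://example.com/2',
    );

    # Phase 1: downloading is I/O-bound, so run many children at once.
    my $pm = Parallel::ForkManager->new(20);
    for my $file (keys %files) {
        $pm->start and next;             # parent moves on; child does the fetch
        getstore($files{$file}, $file);  # the slow network I/O happens in the child
        $pm->finish;
    }
    $pm->wait_all_children;

    # Phase 2: processing is CPU-bound, so do it serially in the parent.
    for my $file (keys %files) {
        process_file($file);             # hypothetical processing routine
    }

Keeping the CPU-heavy work in a single serial loop avoids having dozens of processes fighting over the same few CPUs.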
Replies are listed 'Best First'.
Re: Re (tilly) 7: Parallel Downloads using Parallel::ForkManager or whatever works!!!
by jamesluc (Novice) on Jan 09, 2002 at 19:53 UTC