in reply to Adding parallel processing to a working Perl script
Re^2: Adding parallel processing to a working Perl script
by Jim (Curate) on Apr 21, 2014 at 14:49 UTC
Thank you for your reply, zentara. Ah, this is my dilemma. I get profoundly odd, inconsistent behavior. First, here's a successful run with the forking code commented out:
Now, here's a sequence of runs, one immediately after the other, with the forking code restored:
You can plainly see the bizarre, inconsistent output from one run to the next. The last run stalled. In fact, it's still running as I type this, neither finishing nor producing more output. This is why I'd hoped some kind brother on PerlMonks, one who has much more experience using Parallel::ForkManager than I do, might look at this script and immediately recognize what's wrong with it.
by BrowserUk (Patriarch) on Apr 21, 2014 at 15:58 UTC
The problem is that each of the forks is re-opening the same glob for STDIN. Under Unix this works, because each fork is a new process, so the re-used glob is unique within its own process space. But under Windows, each "fork" is actually just a separate thread within the same process space, so the re-used glob, despite being cloned at the Perl level, is trying to concurrently reuse the same underlying per-process OS buffers and data structures, with the inevitable consequences. When it works, it is by pure chance. Mostly it won't.

With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.
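The usual remedy for the problem BrowserUk describes is to open a fresh lexical filehandle inside each child, after the fork, instead of re-opening a shared bareword glob that existed before it. A lexical handle created in the child belongs to that child alone, even under Windows' thread-based fork emulation. Here is a minimal, self-contained sketch of the idea; it uses plain `fork` rather than Parallel::ForkManager so it runs with core Perl only, and the file contents and child count are invented for the demo:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Create a small input file for the demonstration (10 lines).
my ($tmp_fh, $tmp_name) = tempfile(UNLINK => 0);
print {$tmp_fh} "line $_\n" for 1 .. 10;
close $tmp_fh;

my @pids;
for my $child (1 .. 3) {
    my $pid = fork;
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: open a fresh LEXICAL filehandle here, after the
        # fork. It is private to this process (or to this
        # pseudo-process thread on Windows), so no other child can
        # clobber its underlying buffers.
        open my $fh, '<', $tmp_name or die "open $tmp_name: $!";
        my $count = 0;
        $count++ while <$fh>;
        close $fh;
        exit($count == 10 ? 0 : 1);    # status 0 means success
    }
    push @pids, $pid;
}

# Parent: reap every child and check its exit status.
my $ok = 1;
for my $pid (@pids) {
    waitpid $pid, 0;
    $ok = 0 if $? != 0;
}
unlink $tmp_name;
print $ok ? "all children read 10 lines\n" : "a child failed\n";
```

With Parallel::ForkManager the shape is the same: do the `open my $fh, ...` between `$pm->start` and `$pm->finish`, never before the loop.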