Re^3: Multithread Web Crawler
by xuqy (Initiate) on Sep 23, 2005 at 13:40 UTC ( [id://494528] )
Thank you so much. I did try Parallel::ForkManager, but I ran into a puzzle: how do you share data between processes? To avoid crawling the same page repeatedly, a global tied hash has to be shared by all the crawling processes. I experimented and found that every forked process just ended up with the same initial crawling history: the URLs one child marked as visited were never seen by the parent or by the other children.
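To make the puzzle concrete, here is a stripped-down sketch of what I tried; crawl() and the seed URL are placeholders standing in for my real code:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Parallel::ForkManager;

    my %visited;                          # meant to be the shared crawling history
    my @queue = ('http://example.com/');  # placeholder seed URL
    my $pm    = Parallel::ForkManager->new(5);

    # placeholder: the real routine fetches the page and extracts links
    sub crawl { my ($url) = @_; return () }

    while (my $url = shift @queue) {
        $pm->start and next;              # parent keeps looping; child runs below

        # Both the check and the update happen in the CHILD's copy-on-write
        # copy of %visited, so no other process ever sees them; every child
        # starts from the same history the parent had at fork() time.
        unless ($visited{$url}) {
            $visited{$url} = 1;
            my @links = crawl($url);
            push @queue, @links;          # also lost: this is the child's @queue
        }
        $pm->finish;                      # child exits; its %visited dies with it
    }
    $pm->wait_all_children;

My guess is that %visited needs to be tied to something that actually lives outside each process (IPC::Shareable over shared memory, or a DB_File hash on disk, for example) rather than being a plain in-memory hash. Is that the right direction?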
Could you do me a favor and suggest a patch?