in reply to Re: LWP::Parallel vs. HTTP::GHTTP vs. IO::Socket
in thread LWP::Parallel vs. HTTP::GHTTP vs. IO::Socket
My concern here is that I'll have an array and several hashes tracking URLs that are seen, unseen, down, bad, and so on, and I need to make sure that the data written into those hashes and arrays (as links are yanked from the pages in %seen) is visible to processes already running via fork() or registered in parallel. Would this require some sort of shared memory to work properly? Can a forked process read and write an array or hash created by the parent of the fork?
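To make the question concrete, here's a minimal sketch of what I understand the fork() semantics to be (the URLs and handle names are placeholders, not my real code): the child inherits a copy of %seen, so reading works fine, but anything it writes lands only in its own copy, and the parent sees new links only if they come back over some IPC channel such as a pipe.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# %seen is filled before the fork; the child gets its own copy of it.
my %seen = ( 'http://example.com/' => 1 );   # placeholder URL

# A pipe gives the child a way to report newly found links to the parent.
pipe(my $reader, my $writer) or die "pipe: $!";

my $pid = fork();
die "fork: $!" unless defined $pid;

if ($pid == 0) {
    # --- child ---
    close $reader;

    # Reading the inherited copy works fine.
    print "child sees: ", join(' ', keys %seen), "\n";

    # This write changes only the child's private copy;
    # the parent never sees it.
    $seen{'http://example.com/found'} = 1;

    # To get the link back to the parent, send it over the pipe instead.
    print {$writer} "http://example.com/found\n";
    close $writer;
    exit 0;
}

# --- parent ---
close $writer;
while (my $url = <$reader>) {
    chomp $url;
    $seen{$url} = 1;    # merge links reported by the child
}
close $reader;
waitpid($pid, 0);

# Prints both URLs: the child's discovery arrived via the pipe,
# not via the (unshared) hash.
print "parent sees: ", join(' ', sort keys %seen), "\n";
```

If true two-way sharing were really needed, something like IPC::Shareable (which ties a variable to SysV shared memory) might do it, but a one-way pipe per child seems like the simpler fit for just collecting discovered links.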
I've got a lot of this code "functioning", but now is the time to refactor and get the performance up to speed (pun intended) for a production distribution of the tool.