"I think I understand what's going on here ... It seems to me that gethtml is never called."
Hm. Maybe not so much :)
The heart of the program is these four (extended) lines:
    ## Create the queue
    my $Qlinks = new Thread::Queue;

    ## Start the threads.
    my @threads = map {
        threads->create( \&getHTML, $Qlinks );
    } 1 .. $noOfThreads;

    ## Fetch and parse the first page; queue the links
    listParse( $firstURL, $Qlinks );

    ## Join the threads
    $_->join for @threads;
The second of those lines creates the 10 threads, each running an independent copy of getHTML(), and each is passed a copy of the queue handle. Each thread sits blocked on a read of the queue, waiting for a link to become available. I.e. they do nothing until the next line runs.
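getHTML() itself isn't shown here, but a minimal sketch of a worker with that shape might look like the following. The fetch via LWP::Simple and the loop body are assumptions, not the OP's code; the part that matters is the blocking dequeue and the exit when undef arrives:

    use strict;
    use warnings;
    use threads;
    use Thread::Queue;
    use LWP::Simple qw( get );

    ## Hypothetical worker: block on the shared queue, process each link,
    ## and exit cleanly when an undef is dequeued.
    sub getHTML {
        my( $Qlinks ) = @_;
        while( defined( my $link = $Qlinks->dequeue ) ) {
            my $html = get( $link )
                or next;                  ## skip links that fail to fetch
            ## ... parse/save $html here (the OP's real work) ...
        }
        return;    ## saw undef: no more work, so the thread falls through to be joined
    }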
listParse() also gets a handle to the queue, and whenever it finds a link it posts it to the queue; one of the threads (it doesn't matter which, as they are all identical) will get it and do its thing.
When listParse() has finished finding the links, it pushes one undef per thread and then returns.
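The real listParse() isn't shown either; this is only a sketch of the producer side, assuming $noOfThreads is visible in its scope and using a crude href regex in place of whatever parsing the real sub does:

    use strict;
    use warnings;
    use Thread::Queue;
    use LWP::Simple qw( get );

    my $noOfThreads = 10;    ## assumed: the same count used to start the workers

    ## Hypothetical producer: fetch the page, enqueue every link found, then
    ## push one undef per worker so each thread knows there is no more work.
    sub listParse {
        my( $url, $Qlinks ) = @_;
        my $html = get( $url ) // '';          ## empty string if the fetch fails
        while( $html =~ m{href="([^"]+)"}ig ) {
            $Qlinks->enqueue( $1 );            ## a blocked worker wakes and takes this
        }
        $Qlinks->enqueue( undef ) for 1 .. $noOfThreads;
        return;
    }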
The fourth line waits for all the threads to finish, at which point the only thing left to do is exit.