2ge has asked for the wisdom of the Perl Monks concerning the following question:

Hello dear monks,

I'd like to program a parallel web spider and build a small search engine on top of the great fulltext search engine Sphinxsearch. Now I am deciding which modules I should use. It has been a long, long time since I worked with POE, so in general, is it a good start? I looked around, and it seems to me the best fit would be POE::Component::Client::HTTP for the spidering and POE::Component::EasyDBI for reading from and writing to the DB - or would you suggest something else?

I looked at the POE webpage and there are nice examples there, but maybe one of you already has a small example that spiders over HTTP and stores the pages to a DB/MySQL, or perhaps such a spider already exists - so I am asking you in order not to reinvent the wheel.
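
For reference, here is roughly the shape I have in mind - just an untested sketch, with made-up seed URLs and handler names, and the POE::Component::EasyDBI part left as a comment:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Sketch only: spawn the HTTP client component and request a few
    # seed URLs concurrently; storing pages via POE::Component::EasyDBI
    # would hang off the response handler.
    use POE qw(Component::Client::HTTP);
    use HTTP::Request::Common qw(GET);

    my @seeds = ( 'http://example.com/', 'http://example.org/' );

    POE::Component::Client::HTTP->spawn(
        Alias   => 'ua',      # session alias to post requests to
        Timeout => 30,
    );

    POE::Session->create(
        inline_states => {
            _start => sub {
                # queue one request per seed URL; they run concurrently
                $_[KERNEL]->post( ua => request => 'got_response', GET($_) )
                    for @seeds;
            },
            got_response => sub {
                my ( $request_packet, $response_packet ) = @_[ ARG0, ARG1 ];
                my $request  = $request_packet->[0];     # HTTP::Request
                my $response = $response_packet->[0];    # HTTP::Response
                print $request->uri, ' => ', $response->code, "\n";
                # extract links here and hand the page to the DB layer
            },
        },
    );

    POE::Kernel->run;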

Thank you!

Replies are listed 'Best First'.
Re: Parallel WEB spider, DB support
by Joost (Canon) on Aug 24, 2007 at 13:31 UTC
Re: Parallel WEB spider, DB support
by perrin (Chancellor) on Aug 24, 2007 at 13:10 UTC
    POE should work. The simplest approach is probably forking. You could use Parallel::ForkManager for this.
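
    For instance, a rough sketch (untested; the URL list, the worker count of 5, and the LWP::UserAgent fetch are placeholders, and each child should open its own DB handle since handles don't survive a fork cleanly):

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Sketch: fetch a list of URLs in up to 5 child processes.
        use Parallel::ForkManager;
        use LWP::UserAgent;

        my @urls = ( 'http://example.com/', 'http://example.org/' );
        my $pm   = Parallel::ForkManager->new(5);   # at most 5 children at once
        my $ua   = LWP::UserAgent->new( timeout => 30 );

        for my $url (@urls) {
            $pm->start and next;    # parent: move on to the next URL

            # --- child process ---
            my $response = $ua->get($url);
            if ( $response->is_success ) {
                # parse links and insert the page into MySQL here
                print "$url => ", length( $response->decoded_content ), " bytes\n";
            }
            $pm->finish;            # child exits
        }
        $pm->wait_all_children;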
Re: Parallel WEB spider, DB support
by BrowserUk (Patriarch) on Aug 24, 2007 at 18:17 UTC

    Threaded solutions are possible, easy and very scalable.

    Niceties like not hitting the same server too frequently, only downloading a given URL once, and not retrying failing servers over and over are all made easier by shared state.
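
    For example, a pool of worker threads pulling from a Thread::Queue, with a shared %seen hash so each URL is fetched only once (a sketch only; the worker count, the LWP::UserAgent fetch, and the seed URLs are placeholders):

        #!/usr/bin/perl
        use strict;
        use warnings;

        use threads;
        use threads::shared;
        use Thread::Queue;
        use LWP::UserAgent;

        # seed the work queue; workers would enqueue newly found links too
        my $queue = Thread::Queue->new( 'http://example.com/', 'http://example.org/' );
        my %seen :shared;    # shared state: URLs already claimed by some thread

        sub worker {
            my $ua = LWP::UserAgent->new( timeout => 30 );
            while ( defined( my $url = $queue->dequeue ) ) {
                # claim the URL atomically; skip it if another thread got there first
                next if do { lock %seen; $seen{$url}++ };

                my $response = $ua->get($url);
                next unless $response->is_success;
                # extract links, enqueue new ones, store the page in the DB here
                print "$url fetched by thread ", threads->tid, "\n";
            }
        }

        my @workers = map { threads->create( \&worker ) } 1 .. 4;
        $queue->enqueue( (undef) x @workers );    # one undef sentinel per worker
        $_->join for @workers;

    A real spider needs a smarter shutdown condition once workers also enqueue new links, but per-server rate limiting and retry counts fit naturally into the same shared-state pattern.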


    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.