in reply to Parallel HTTP requests under mod_perl2

The threaded model generally has significantly worse performance because of the way Perl threads work. On the mod_perl list, we recommend prefork for those that can use it (i.e. everyone but Win32).

More likely, LWP::Parallel::UserAgent is just not very fast. I suggest trying a faster client module like HTTP::MHTTP or HTTP::GHTTP, and switching to a forking model where you fork (yes, it's okay to fork from mod_perl) and write the responses back to a file or database. A prefork model is also possible, but coding all the IPC for it is harder.
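A minimal sketch of that fork-and-collect pattern, using a temp file per child as the cheap IPC channel. `fetch_url` here is just a stand-in stub so the sketch runs without network access; in a real handler you would call a fast client like HTTP::GHTTP (e.g. `set_uri`/`process_request`/`get_body`) in its place, and mod_perl-specific cleanup is omitted:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Stand-in for a real fetch with HTTP::GHTTP or HTTP::MHTTP;
# returns a canned body so the sketch is self-contained.
sub fetch_url {
    my ($url) = @_;
    return "response for $url";
}

my @urls = ('http://example.com/a', 'http://example.com/b');

# CLEANUP only fires in the process that created the dir,
# so children exiting won't remove it out from under the parent.
my $dir = tempdir(CLEANUP => 1);

# Fork one child per URL; each writes its response to a file.
my @pids;
for my $i (0 .. $#urls) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {    # child
        open my $fh, '>', "$dir/$i" or die "open: $!";
        print {$fh} fetch_url($urls[$i]);
        close $fh;
        exit 0;
    }
    push @pids, $pid;
}

# Parent reaps the children, then reads the responses back.
waitpid($_, 0) for @pids;

my @responses;
for my $i (0 .. $#urls) {
    open my $fh, '<', "$dir/$i" or die "open: $!";
    local $/;           # slurp mode
    push @responses, <$fh>;
    close $fh;
}
print "$_\n" for @responses;
```

The same pattern works with a database table instead of files if you need the results to survive the request.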


Re^2: Parallel HTTP requests under mod_perl2
by StoneTable (Beadle) on Mar 08, 2006 at 13:33 UTC

    I'll look into handling the forking myself along with HTTP::MHTTP or HTTP::GHTTP. Writing the responses to a file or database really isn't useful for me: the data I'm getting cannot be cached, so I have to make the request every time.

    The reason I was considering the threaded model is that I'm running into memory constraints that seem to be related to prefork (one interpreter per child). This application is heavily bound on network I/O, and during my limited testing with threading I was able to accept a much larger number of concurrent requests.

      I just meant you could use the file as cheap IPC to get the responses back into the parent process; there are other ways to do it, though.

      Be careful what you measure in terms of memory. On a Linux system, much of the size of an httpd process is actually shared by copy-on-write. See the mod_perl docs for more info.