in reply to Re: Parallel HTTP requests under mod_perl2
in thread Parallel HTTP requests under mod_perl2

I'll look into handling the forking myself along with HTTP::MHTTP or HTTP::GHTTP. Writing the responses to a file or database really isn't useful for me. The data I'm getting cannot be cached, so I have to make the request every time.
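
Roughly what I have in mind (untested, with placeholder URLs, and assuming HTTP::GHTTP's basic set_uri/process_request/get_body interface):

    use strict;
    use warnings;
    use HTTP::GHTTP ();

    my @urls = ('http://example.com/one', 'http://example.com/two');  # placeholders

    my @pids;
    for my $url (@urls) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {
            # child: fetch one URL and do something with the body
            my $r = HTTP::GHTTP->new();
            $r->set_uri($url);
            $r->process_request;
            my $body = $r->get_body;
            # ... process $body here ...
            exit 0;
        }
        push @pids, $pid;
    }

    # parent: wait for all the fetchers to finish
    waitpid $_, 0 for @pids;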

The reason I was considering the threaded model is that I'm running into memory constraints that seem to be related to prefork (one interpreter per child). This application is heavily bound by network I/O, and during my limited testing with threading I was able to accept a much larger number of concurrent requests.


Re^3: Parallel HTTP requests under mod_perl2
by perrin (Chancellor) on Mar 08, 2006 at 13:49 UTC
    I just meant you could use the file as cheap IPC to get the responses back into the parent process. There are other ways to do this, though.
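
    Something like this is what I had in mind (rough and untested, with a hypothetical fetch() standing in for whatever HTTP client you end up using):

        use strict;
        use warnings;
        use File::Temp qw(tempdir);

        my @urls = ('http://example.com/one', 'http://example.com/two');  # placeholders
        my $dir  = tempdir(CLEANUP => 1);    # scratch area shared by parent and children

        my @pids;
        for my $i (0 .. $#urls) {
            my $pid = fork();
            die "fork failed: $!" unless defined $pid;
            if ($pid == 0) {
                # child: fetch one URL and drop the body in a file the parent knows about
                my $body = fetch($urls[$i]);   # stand-in for HTTP::GHTTP etc.
                open my $fh, '>', "$dir/$i" or die "open: $!";
                print {$fh} $body;
                close $fh;
                exit 0;
            }
            push @pids, $pid;
        }
        waitpid $_, 0 for @pids;

        # back in the parent: read each child's response off disk
        my @responses;
        for my $i (0 .. $#urls) {
            open my $fh, '<', "$dir/$i" or die "open: $!";
            local $/;                          # slurp mode
            push @responses, scalar <$fh>;
            close $fh;
        }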

    Be careful what you measure in terms of memory. On a Linux system, much of the size of an httpd process is actually shared by copy-on-write. See the mod_perl docs for more info.
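
    If you want hard numbers, the mod_perl docs use GTop for this sort of measurement; a rough sketch, assuming GTop is installed, comparing a process's total size against the shared portion:

        use GTop ();

        my $gtop     = GTop->new;
        my $proc_mem = $gtop->proc_mem($$);   # memory stats for this process
        my $size     = $proc_mem->size;
        my $share    = $proc_mem->share;
        printf "size: %d  shared: %d  unshared: %d\n",
            $size, $share, $size - $share;

    The unshared figure is the one that actually grows per child, so it's a better guide than raw process size from top.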