I'll look into handling the forking myself along with HTTP::MHTTP or HTTP::GHTTP. Writing the responses to a file or database isn't useful for me: the data I'm getting can't be cached, so I have to make the request every time.
The reason I was considering the threaded model is that I'm running into memory constraints that seem to come from the prefork model (one Perl interpreter per child). This application is heavily bound by network I/O, and in my limited testing with threads I was able to accept a much larger number of concurrent requests.
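For the "handle the forking myself" route, a minimal sketch of the pattern: fork one child per request, have each child write its result back over a pipe, and reap the children with waitpid. The `parallel_fetch` name is invented for this sketch, and the child's HTTP call is stubbed out (the commented HTTP::GHTTP lines show where a real request would go) so the example runs without network access.

```perl
use strict;
use warnings;

# Fork one child per URL, collect each child's output over a pipe,
# and reap the children with waitpid to avoid zombies.
sub parallel_fetch {
    my @urls = @_;
    my %pipe_for;   # pid => parent's read handle
    my %url_for;    # pid => url that child is fetching
    for my $url (@urls) {
        pipe(my $r, my $w) or die "pipe: $!";
        defined(my $pid = fork()) or die "fork: $!";
        if ($pid == 0) {    # child
            close $r;
            # A real handler might use HTTP::GHTTP here, e.g.:
            #   my $g = HTTP::GHTTP->new($url);
            #   $g->process_request;
            #   print $w $g->get_body;
            # Stand-in so the sketch runs without network access:
            print $w "fetched $url";
            close $w;
            exit 0;
        }
        close $w;           # parent keeps only the read end
        $pipe_for{$pid} = $r;
        $url_for{$pid}  = $url;
    }
    my %body_for;
    for my $pid (keys %pipe_for) {
        my $fh = $pipe_for{$pid};
        $body_for{ $url_for{$pid} } = do { local $/; <$fh> };
        close $fh;
        waitpid($pid, 0);   # reap the child
    }
    return %body_for;
}

my %result = parallel_fetch('http://example.com/a', 'http://example.com/b');
print "$_ => $result{$_}\n" for sort keys %result;
```

Note that forking inside an Apache child needs extra care under mod_perl (inherited sockets and cleanup handlers, in particular); this only shows the fork/pipe/waitpid pattern itself.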
In reply to Re^2: Parallel HTTP requests under mod_perl2
by StoneTable
in thread Parallel HTTP requests under mod_perl2
by StoneTable