in reply to Perl threads to open 200 http connections
You are creating 200 new instances of cmd.exe, each of which spawns an instance of wget. I suspect you are running out of memory or some other resource. Also, hammering a single server with 200 simultaneous connections is not very friendly, and unless you have a very fast link, it will most likely saturate your own network connection.
forking breaks
No, real forking is not implemented on Windows. Perl on Windows emulates fork() with a pseudo-fork that creates a new interpreter thread for each fork. I suspect your process is running out of (interpreter) threads. I don't know exactly how exec() is implemented on Windows, but since the Windows API has no exec(), it must be emulated using CreateProcess() plus some code that waits for the spawned process to exit.
I would like to see how "forking breaks". Show us the code and the error reported in $!.
The funny thing here is that you don't need more than a single process with a single thread to start 200 instances of wget. Just create all of those processes in a loop, making sure not to wait for any of them until you have to. On Unix, you would simply fork them and remember the PIDs, then wait until you have seen all child processes exit, either by handling SIGCHLD or by calling waitpid() in a loop. On Windows, you would use system(1, ...) if you don't care about exiting before your children do, or Win32::Process::Create() (from Win32::Process) instead of fork, and poll Win32::Process::Wait() instead of handling SIGCHLD.
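A minimal sketch of the Unix variant described above. The commands here are harmless placeholders; in the real script each entry in @jobs would be something like ('wget', '-q', $url). This reaps children with waitpid() in a loop rather than a SIGCHLD handler, which is the simpler of the two options mentioned:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Placeholder jobs: each would really be ('wget', '-q', $url) or similar.
# $^X is the path to the running perl, so the sketch is self-contained.
my @jobs = map { [ $^X, '-e', '1' ] } 1 .. 5;

my %kids;
for my $cmd (@jobs) {
    defined( my $pid = fork() ) or die "fork failed: $!";
    if ( $pid == 0 ) {
        # Child: replace ourselves with the external command.
        exec @$cmd or die "exec failed: $!";
    }
    # Parent: remember the PID, but do NOT wait here --
    # that is what lets all the children run concurrently.
    $kids{$pid} = 1;
}

# Now reap every child we started.
while (%kids) {
    my $pid = waitpid( -1, 0 );
    last if $pid <= 0;
    delete $kids{$pid};
}
print "all children done\n";
```

On Windows the loop body would instead be `system(1, @$cmd)` (which returns immediately with the new process's PID) or Win32::Process::Create(), with Win32::Process::Wait() doing the reaping.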
Alexander
Replies are listed 'Best First'.

- Re^2: Perl threads to open 200 http connections, by robrt (Novice) on Aug 03, 2010 at 11:11 UTC
- Re^2: Perl threads to open 200 http connections, by BrowserUk (Patriarch) on Aug 03, 2010 at 11:31 UTC