We wanted to do something similar (but different). We wanted to stress test a dynamic site and see how many transactions per second per core we could achieve. These tests were for Linux/Unix only, so I'm not sure how to do this with Windows.
We were able to saturate a 100 Mb Ethernet interface with "use HTTP::Lite;", but that didn't stress our target system. We then used a Perl loop to call 'system "wget . . . /&";' with the URLs, and with that we were able to saturate a 1,000 Mb Ethernet interface and genuinely stress the application. All this does is let each "wget" run in its own address space and be dispatched by the operating system independently of Perl (fork may meet your requirements as well). We used 16 URLs and called them 1,000 times from the Perl script.
In our case the URL was the same and the parameters sent to the application varied, but the results let us stress test the application; see the sketch below.
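For what it's worth, here is a minimal sketch of that loop. The host name, parameter, wget flags, and counts are placeholders rather than our actual test setup:

    #!/usr/bin/perl
    # Each wget is backgrounded with '&', so it runs in its own address
    # space and is scheduled by the OS independently of this Perl script.
    use strict;
    use warnings;

    # 16 URLs, same application, different parameter values (placeholders).
    my @urls = map { "http://target.example.com/app?param=$_" } 1 .. 16;

    for (1 .. 1000) {
        for my $url (@urls) {
            # -q and -O /dev/null discard the response so the client
            # side stays as cheap as possible.
            system("wget -q -O /dev/null '$url' &");
        }
    }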
I don't think your problem is Perl; it's more that "wget" and "Siege" are optimized to fetch URL data using all the "bells and whistles" of the operating system.
Good Luck.
"Well done is better than well said." - Benjamin Franklin