Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I am just getting familiar with threading, but I am not sure it can help trigger 100 simultaneous HTTP requests. We are launching a site with a 100-user Oracle license limit next week. I was hoping to write a script using WWW::Mechanize that would simulate 99 users accessing the site at the same time, for load testing, just to analyze the server's response (Linux Red Hat with 8 processors). Is this the way to go? Any pointers to how other people have done this before? Thank you all.

Replies are listed 'Best First'.
Re: Load testing/Simultaneous HTTP requests
by rhesa (Vicar) on Mar 04, 2006 at 13:52 UTC
    LWP::Parallel::UserAgent might help you out.
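    As an illustration, here is a minimal sketch of how LWP::Parallel::UserAgent could be used for this: register one GET per simulated user, then let wait() run them in parallel. The URL pattern and timeout are placeholders, not anything from the original post.

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::Parallel::UserAgent;
    use HTTP::Request;

    # Placeholder URLs -- substitute the page(s) you actually want to exercise.
    my @urls = map { "http://your.server/url?user=$_" } 1 .. 99;

    my $pua = LWP::Parallel::UserAgent->new;
    $pua->timeout(10);   # per-request timeout, in seconds
    $pua->redirect(1);   # follow redirects

    # Register all requests; they are issued in parallel by wait().
    for my $url (@urls) {
        if ( my $err = $pua->register( HTTP::Request->new( GET => $url ) ) ) {
            print STDERR $err->error_as_HTML;
        }
    }

    # Block until every request has completed (or timed out).
    my $entries = $pua->wait;
    for my $key ( keys %$entries ) {
        my $res = $entries->{$key}->response;
        printf "%s => %s %s\n", $res->request->url, $res->code, $res->message;
    }
    ```

    Note that the parallelism here is I/O-level (multiple sockets from one process), which is usually enough for measuring server-side concurrency.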

    For a quickstart, Apache comes with a simple benchmarking utility called ab. You can use that to do a number of concurrent requests, like ab -c 99 -n 500 http://your.server/url.
    ab is not very useful if you need detailed information about the responses on the client side. I use it mainly to see how the server behaves to many concurrent requests.

Re: Load testing/Simultaneous HTTP requests
by BrowserUk (Patriarch) on Mar 04, 2006 at 16:34 UTC

    Unless you run your loadtest script on a machine with 99 processors, you are not going to hit the server at exactly the same time, regardless of whether you use processes, threads, or async IO.

    Of course, even if you had 99 processors, the variability in network responses, the bottleneck of the TCP/IP stack and interface card, and congestion delays (whether Floyd-style or BEB) all mean that you aren't going to hit it at exactly the same time.

    You would probably need to run 100 "users" on 10 machines for several thousand cycles before you would come close to 100 simultaneous requests, but you're probably more concerned with handling 100 concurrent requests?

    This will come close to your requirements. You may need to adjust the delay factor (0.1) to ensure that all 100 threads are spawned and ready to go at the same time. The simple trace will tell you how close to simultaneous the requests were issued; your log file will tell you the rest. You could also have it delay until a specified time of day and (provided your machines are time-synced) run multiple copies on different machines to get a more realistic test. Anyway, it's a simple starting point.

    #! perl -slw
    use strict;
    use threads;
    use Time::HiRes qw[ time sleep ];
    use LWP::Simple;

    sub hitEm {
        my( $url, $when ) = @_;
        sleep $when - time;
        printf "%3d : %s\n", threads->tid, time;
        get $url;
    }

    my( $users, $url ) = @ARGV;
    my $when = time + 0.1 * $users;

    my @users = map {
        threads->create( \&hitEm, $url, $when );
    } 1 .. $users;

    sleep $when - time;
    $_->join for @users;

    __END__
    c:\test>534459 10 http://news.bbc.co.uk/
      5 : 1141489281.46878
      6 : 1141489281.46877
      7 : 1141489281.46877
     10 : 1141489281.46877
      8 : 1141489281.48441
      9 : 1141489281.48439
      3 : 1141489281.48439
      1 : 1141489281.4844
      2 : 1141489281.48439
      4 : 1141489281.4844

Re: Load testing/Simultaneous HTTP requests
by perrin (Chancellor) on Mar 04, 2006 at 15:37 UTC
    Although somewhat difficult to use, httperf is much better when you want many simultaneous connections.
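    As a rough sketch (server name, URI, and counts are placeholders), an httperf run that opens 100 connections against the test server might look like:

    ```shell
    # 100 total connections, 1 HTTP call per connection.
    # --rate controls how many new connections are started per second,
    # so --rate 100 attempts to open all of them within one second.
    httperf --server your.server --port 80 --uri /url \
            --num-conns 100 --num-calls 1 --rate 100 --timeout 5
    ```

    httperf then reports connection rate, reply rate, and reply status counts, which makes it easier to see where the server starts refusing or delaying connections.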