in reply to Re^3: Forking Multiple Threads
in thread Forking Multiple Threads

I am not set on using fork and threads at the same time. My problem is that I am currently forking 500 processes to run my load and want to reduce the number of processes due to hardware limitations. I have tried threads, and I can see that 100 threads consume fewer resources than 100 forked processes. So that's my concern: how do I create 500 threads at the same time, when one process does not support more than 100 threads? Please help me, my dear friend.

Re^5: Forking Multiple Threads
by BrowserUk (Patriarch) on Feb 08, 2012 at 12:55 UTC
    One process does not support more than 100 threads?

    That limitation is due to the stupidly large default stack size allocated to each thread. You can overcome it by setting stack_size => 4096 when loading the module, as detailed in its POD.
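
    For reference, the stack size can be set globally on the use line or per thread at creation time; both forms are in the threads POD. The value is in bytes, and the OS may round very small values up to its minimum:

    use threads ( stack_size => 4096 );    # process-wide default for threads created from here on

    # Or per thread, at creation time:
    my $thr = threads->create( { stack_size => 4096 }, sub { 42 } );
    print $thr->join(), "\n";              # prints 42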

    This runs 500 threads, uses around 1.5 GB, and HEADs 1000 URLs in 24 seconds over my 2Mb/s connection:

    #! perl -slw
    use strict;
    use LWP::Simple;
    use threads stack_size => 4096;   # small per-thread stack so 500 threads fit
    use threads::shared;
    use Thread::Queue;

    our $THREADS //= 500;             # overridable on the command line (-THREADS=nnn) via the -s switch

    my %log :shared;
    my $Q = new Thread::Queue;

    # Worker pool: each thread HEADs urls from the queue until it dequeues undef.
    my @threads = map async( sub {
        while( my $url = $Q->dequeue() ) {
            my @info = head 'http://' . $url;
            lock %log;
            $log{ $url } = join $;, map{ local $^W; $_ // '*n/a*' } @info;
        }
    } ), 1 .. $THREADS;

    # Feed the queue from stdin, throttling if the workers fall behind.
    while( <> ) {
        chomp;
        sleep 1 if $Q->pending > $THREADS;
        $Q->enqueue( $_ );
    }

    $Q->enqueue( (undef) x $THREADS );   # one terminator per worker
    $_->join for @threads;

    my( $url, $status );
    print "$url : $status" while ( $url, $status ) = each %log;

    __END__
    c:\test>t-head-urls -THREADS=100 urls.list.small
    www.t-mobile.com : text/html; charset=utf-8?109511?873075813?*n/a*?Microsoft-IIS/7.0
    www.avocent.com : text/html; charset=UTF-8?114030?*n/a*?*n/a*?Microsoft-IIS/7.0
    www.fsw.com : text/html?9926?1306121940?*n/a*?Apache/2.2.21 (Unix) mod_ssl/2.2.21 OpenSSL/0.9.7a mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635 mod_fcgid/2.3.5 Resin/3.1.10 Sun-ONE-ASP/4.0.3
    www.voodoopc.com : text/html?6021?*n/a*?1328712575?Apache
    www.creative.com : text/html; Charset=iso-8859-1?42330?*n/a*?1328705318?Microsoft-IIS/6.0
    www.belkin.com :
    www.argus-systems.com : text/html?8111?1320697393?*n/a*?Apache/2.2.6 (Unix) mod_python/3.3.1 Python/2.5.1 mod_ssl/2.2.6 OpenSSL/0.9.8g
    www.iss.net : text/html; charset=UTF-8?*n/a*?*n/a*?*n/a*?Apache
    www.Joyent.COM : text/html; charset=UTF-8?*n/a*?*n/a*?*n/a*?Apache
    ...

    With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.

    The start of some sanity?

Re^5: Forking Multiple Threads
by Corion (Patriarch) on Feb 08, 2012 at 12:24 UTC

    Creating more (Perl) threads than you have CPUs/cores in your machine makes little sense. Use an asynchronous fetch library, such as WWW::Curl and/or AnyEvent, to fetch multiple URLs within one child process or thread, instead of spawning a separate thread or child process for each URL.
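
    A rough sketch of that approach, using AnyEvent::HTTP as one concrete AnyEvent-based client (WWW::Curl's multi interface would work similarly), reading one host per line from STDIN as the threaded script above does:

    #! perl -lw
    # Sketch: event-driven HEAD requests -- one process, no threads, no forks.
    use strict;
    use AnyEvent;
    use AnyEvent::HTTP;

    $AnyEvent::HTTP::MAX_PER_HOST = 8;    # cap concurrent connections per host

    my $cv = AE::cv;                      # fires once every request has finished

    chomp( my @hosts = <> );              # one host per line, e.g. "www.example.com"

    for my $host ( @hosts ) {
        $cv->begin;                       # count one outstanding request
        http_head "http://$host", sub {
            my ( undef, $hdr ) = @_;      # HEAD: no body, just the header hash
            my $type = $hdr->{'content-type'} // '*n/a*';
            print "$host : $hdr->{Status} $type";
            $cv->end;                     # one request finished
        };
    }

    $cv->recv;                            # run the event loop until all requests complete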