in reply to Re: Forking Multiple Threads
in thread Forking Multiple Threads

I do understand the difference between threads and forks. First, I was using WWW::Curl for downloading files, but the requirement says I need to use threads and forks. I have used Parallel::ForkManager in the past as well, but here I want to fork a process with 100 threads, and whatever operation each thread performs is stored in a global variable (a hash), so that at the end of the program I can see everything that happened (in case anything goes wrong). A minimal sketch of the shared-hash idea is below. I hope that answers your query!
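
For reference, a minimal sketch of the shared-hash part using threads::shared; the worker payload and the thread count here are placeholders, not the real download code:

use strict;
use warnings;
use threads;
use threads::shared;

my %status :shared;                   # every thread records its outcome here

my @workers = map {
    my $id = $_;
    threads->create( sub {
        my $result = "item $id ok";   # placeholder for the real download
        lock %status;                 # serialize writes to the shared hash
        $status{ $id } = $result;
    } );
} 1 .. 10;                            # 10 threads for brevity; same pattern for 100

$_->join for @workers;

# at the end of the program, inspect everything that happened
print "$_ => $status{$_}\n" for sort { $a <=> $b } keys %status;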

Re^3: Forking Multiple Threads
by Corion (Patriarch) on Feb 08, 2012 at 12:02 UTC

    That's not how fork works. There is nothing I can do to help you further. I recommend you learn how fork works, and then find a way to store your data before the child process finishes, so that it can communicate with the parent. Maybe a logfile for each child would work for you; see the sketch below.
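
    For illustration, a minimal sketch of the per-child logfile idea; the filenames and the work done in each child are hypothetical:

    use strict;
    use warnings;

    my @urls = qw( example.com example.org );   # placeholder work items
    my @pids;

    for my $url ( @urls ) {
        my $pid = fork;
        die "fork failed: $!" unless defined $pid;
        if( $pid == 0 ) {                       # child: do the work, log, exit
            open my $log, '>', "child.$$.log" or die $!;
            print {$log} "$url : done\n";       # real download/result goes here
            close $log;
            exit 0;
        }
        push @pids, $pid;                       # parent: remember the child
    }

    waitpid $_, 0 for @pids;                    # wait for every child to finish

    for my $pid ( @pids ) {                     # then collect the per-child logs
        open my $log, '<', "child.$pid.log" or next;
        print while <$log>;
        close $log;
    }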

    Mixing threads and fork is something I highly advise against, and mixing database access with threads or fork is also something to avoid.

    If you have "outside" (that is, homework) requirements to use both threads and fork, the assignment was likely given to you that way so that you learn about fork and threads, and the differences, advantages, and disadvantages of each. I recommend that you review your course material for more information, or ask the person who gave you the requirement.

      I am not strict on using fork and threads at the same time. My problem is that I am currently forking 500 processes to run my load and want to reduce the number of processes due to hardware limitations. I have tried threads, and I can see that 100 threads consume fewer resources than 100 forked processes. So that is my concern: how should I create 500 threads at the same time, when one process does not support more than 100 threads? Please help me, my dear friend.
        One process does not support more than 100 threads?

        That limitation is due to the stupidly large default stack size allocated to each thread. You can overcome it by using the stack_size => 4096 option detailed in the threads module's POD.

        This runs 500 threads, uses around 1.5 GB, and HEADs 1000 URLs in 24 seconds over my 2 Mb/s connection:

        #! perl -slw
        use strict;
        use LWP::Simple;
        use threads stack_size => 4096;   # small per-thread stack; the default is what limits you to ~100 threads
        use threads::shared;
        use Thread::Queue;

        our $THREADS //= 500;             # override on the command line with -THREADS=n

        my %log :shared;                  # results, shared across all threads
        my $Q = Thread::Queue->new;

        # spawn the worker pool; each thread HEADs URLs from the queue until it sees undef
        my @threads = map async( sub {
            while( my $url = $Q->dequeue() ) {
                my @info = head 'http://' . $url;
                lock %log;
                $log{ $url } = join $;, map{ local $^W; $_ // '*n/a*' } @info;
            }
        } ), 1 .. $THREADS;

        # feed URLs from stdin/ARGV, throttling so the queue never grows unbounded
        while( <> ) {
            chomp;
            sleep 1 if $Q->pending > $THREADS;
            $Q->enqueue( $_ );
        }

        $Q->enqueue( (undef) x $THREADS );  # one undef per thread signals "no more work"
        $_->join for @threads;

        my( $url, $status );
        print "$url : $status" while ( $url, $status ) = each %log;

        __END__
        c:\test>t-head-urls -THREADS=100 urls.list.small
        www.t-mobile.com : text/html; charset=utf-8?109511?873075813?*n/a*?Microsoft-IIS/7.0
        www.avocent.com : text/html; charset=UTF-8?114030?*n/a*?*n/a*?Microsoft-IIS/7.0
        www.fsw.com : text/html?9926?1306121940?*n/a*?Apache/2.2.21 (Unix) mod_ssl/2.2.21 OpenSSL/0.9.7a mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635 mod_fcgid/2.3.5 Resin/3.1.10 Sun-ONE-ASP/4.0.3
        www.voodoopc.com : text/html?6021?*n/a*?1328712575?Apache
        www.creative.com : text/html; Charset=iso-8859-1?42330?*n/a*?1328705318?Microsoft-IIS/6.0
        www.belkin.com :
        www.argus-systems.com : text/html?8111?1320697393?*n/a*?Apache/2.2.6 (Unix) mod_python/3.3.1 Python/2.5.1 mod_ssl/2.2.6 OpenSSL/0.9.8g
        www.iss.net : text/html; charset=UTF-8?*n/a*?*n/a*?*n/a*?Apache
        www.Joyent.COM : text/html; charset=UTF-8?*n/a*?*n/a*?*n/a*?Apache
        ...


        Creating more (Perl) threads than you have CPUs / cores in your machine makes little sense. Use an asynchronous fetch library, like WWW::Curl and/or AnyEvent, to fetch multiple URLs within one child process or thread, instead of spawning a separate thread or child process for each URL.
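
        A minimal sketch of that single-process approach using AnyEvent::HTTP; the URL list and the %log naming here are illustrative, not from the thread:

        use strict;
        use warnings;
        use AnyEvent;
        use AnyEvent::HTTP;                    # event-driven HTTP client: one process, no threads

        my @urls = map { chomp; "http://$_" } <>;   # read host names from stdin/ARGV

        my $cv  = AE::cv;                      # condvar tracks outstanding requests
        my %log;                               # no locking needed: single-threaded event loop

        for my $url ( @urls ) {
            $cv->begin;
            http_head $url, sub {
                my( undef, $hdr ) = @_;        # HEAD request: body is empty
                $log{ $url } = "$hdr->{Status} $hdr->{Reason}";
                $cv->end;
            };
        }

        $cv->recv;                             # run the event loop until every callback has fired

        print "$_ : $log{$_}\n" for sort keys %log;

        Note that AnyEvent::HTTP limits the number of concurrent connections per host ($AnyEvent::HTTP::MAX_PER_HOST), so hammering a single server is throttled for you.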