in reply to Multithreaded memory usage

Any ideas?

Two ideas actually. The first is to search for memory leaks with something like Devel::Cycle or Devel::Leak - maybe you just have a circular reference somewhere that prevents perl from destroying the objects.
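
A minimal illustration with Devel::Cycle (the self-referential hash below is just a demonstration, not your actual data):

    use Devel::Cycle;

    # a deliberately leaky structure: the hash refers to itself
    my $node = { name => 'task' };
    $node->{self} = $node;

    # reports the reference loop so you know what to break
    find_cycle($node);

    # break the cycle (or weaken it with Scalar::Util::weaken)
    delete $node->{self};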

The second idea is to spawn a new process (update: I originally wrote "thread", see the replies below) for each task, and let the operating system clean up the memory for you.
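
A rough sketch of the fork-per-task idea, where @tasks and do_task() are stand-ins for your real work:

    # one child per task; all of the child's memory is returned
    # to the OS when that child exits
    foreach my $task (@tasks) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {          # child
            do_task($task);       # the actual work
            exit 0;               # never fall through to the loop
        }
        waitpid($pid, 0);         # parent: reap before the next task
    }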

Re^2: Multithreaded memory usage
by bingohighway (Acolyte) on Apr 21, 2009 at 08:04 UTC
    Ok, I'll try that. I am assuming that once a detached thread reaches the end of its code block, it should automatically quit and the OS should recover that memory? (assuming it is coded correctly :-) )

    Cheers

      I am assuming that once a detached thread reaches the end of its code block, it should automatically quit and the OS should recover that memory?

      I'm really not an expert here, but I don't think that's the case, at least not always. The OS probably doesn't have any idea which parts of memory are associated with which user-level thread, and thus can't clean them up. It does know about processes, though, which is why I recommended them over threads.

        I'm a little confused here. I'm currently using the threads->create() method. Isn't that already spawning a new thread? Would a new process be something like using backticks to execute another Perl script?

        Cheers

        P.S. Did you go to St Andrews to do Physics at all?

Re^2: Multithreaded memory usage
by why_bird (Pilgrim) on Apr 21, 2009 at 09:45 UTC
    The second idea is to spawn a new thread for each task

    Did you mean process rather than thread?

    update: p.s. hurrah for physicists (especially those forgetting everything they learnt by doing a completely unrelated job) :)

    ........
    Those are my principles. If you don't like them I have others.
    -- Groucho Marx
    .......
      Yes, that's what I meant. Sorry for the confusion.
Re^2: Multithreaded memory usage
by bingohighway (Acolyte) on Apr 21, 2009 at 13:25 UTC
    Okay,

    I have tried the fork method and after about 60 forks (over 5 mins) the parent process can't fork any more processes. I have tried setting $SIG{CHLD} = 'IGNORE' to ignore the zombified children, but to no avail.

    Any ideas where to go to next?

    Cheers!

      On Windows, start the subprocess not via fork() but via system(1, @args) or system("start @args"). That way it is dissociated from the main Perl program.
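
      A minimal sketch of that form (worker.pl is a hypothetical script holding the per-task code):

          # spawns a detached child and returns at once with its
          # process designator instead of waiting for it to finish
          my $pid = system(1, $^X, 'worker.pl');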

      wait (or waitpid) for them (perhaps in a $SIG{CHLD} handler, to stay nearly non-blocking) so that they don't become zombies? And what's the error message?
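
      For example, a common reaper sketch (untested here, adapt to your code):

          use POSIX ':sys_wait_h';    # for WNOHANG

          # reap every child that has already exited, without blocking
          $SIG{CHLD} = sub {
              1 while waitpid(-1, WNOHANG) > 0;
          };
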
        This is a simplified version of the code; it can't fork any more processes after 64 loops ($pid is undefined):

            use LWP::Simple;

            my @pid;
            for (my $i = 0; $i < 720; $i++) {
                print "$i\n";
                $pid[$i] = fork();
                if (not defined $pid[$i]) {
                    print "resources not available.\n";
                }
                elsif ($pid[$i] == 0) {
                    get_data_and_go();
                    exit(0);
                }
                else {
                    sleep 2;
                }
            }

            sub get_data_and_go {
                # does some webpage stuff and exits
                exit(0);
            }
        The question is: will the wait function actually pause the program? (That's what I don't want; see the update below.)

        Cheers!
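
        Update: plain wait does block until a child exits, but waitpid with WNOHANG (from POSIX) returns immediately, so a sketch like this would not pause the program:

            use POSIX ':sys_wait_h';

            # returns a finished child's PID, 0 if children are still
            # running, or -1 if none are left; it never blocks
            my $reaped = waitpid(-1, WNOHANG);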