jettero has asked for the wisdom of the Perl Monks concerning the following question:

I have a program with an evil memory leak. I think it's got a hash that gets bigger and bigger while it runs. It takes a few days before it gets outta hand.

While I was in the process of trackin' that down, I got to thinkin'. This particular program forks lotsa kids. They all die properly, but because of the stupid memory leak, it's a little like opening netscape recursively.

I don't need all the different parts of the program during the fork though. Is there a way (in perl) to limit what parts of the program get copied in a fork? I know there's a way to do it in C (vfork/threads), but can ya do that stuff in perl?

Replies are listed 'Best First'.
RE (tilly) 1: Memory
by tilly (Archbishop) on Sep 03, 2000 at 18:27 UTC
    Don't worry about it. Even in C there are good reasons to prefer fork() over vfork(). Here is a link to a relevant discussion.

    Basically in Linux or any modern Unix, fork() will be implemented using mmap(). You do not copy any data and you do not create new data. You just mark everything copy-on-write and spawn a new process. A page is physically copied only once one of the processes starts to write to it.
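    A quick sketch of what that copy-on-write behavior looks like from Perl (the variable and the values here are made up for illustration): the child's write only touches the child's own copy of the page, so the parent's data is never affected.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# After fork() the child shares the parent's pages copy-on-write.
# Writing in the child copies only the touched page for the child;
# the parent never sees the change.
my $counter = 1;

my $pid = fork();
die "fork failed: $!" unless defined $pid;

if ($pid == 0) {      # child
    $counter = 99;    # this write lands on the child's private copy
    exit 0;
}

waitpid($pid, 0);     # parent
print "parent still sees counter = $counter\n";   # prints 1, not 99
```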

    That said, Perl could do a better job of separating out data in use from code. (Right now when you start to write you do entirely too much copying.) I believe that is on the wishlist for 6.0.

Re: Memory
by cianoz (Friar) on Sep 03, 2000 at 17:21 UTC
    Perl has threads;
    outside of that there is no way to control how the process is forked, it just gets copied.
    If you don't need those large variables in the child processes you can always undef() them just after the fork()...
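    A minimal sketch of that undef-after-fork idea (%big_cache is a made-up stand-in for the leaky hash): the child drops its copy-on-write reference to the data right after the fork, while the parent's hash is untouched.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# %big_cache stands in for the large, leaky hash in the parent.
my %big_cache = map { $_ => 'x' x 100 } 1 .. 10_000;

my $pid = fork();
die "fork failed: $!" unless defined $pid;

if ($pid == 0) {
    # Child: free the data it doesn't need. This only releases the
    # child's reference to the copy-on-write pages; the parent keeps
    # its hash intact.
    undef %big_cache;
    # ... do the child's real work here ...
    exit 0;
}

waitpid($pid, 0);
print "parent still has ", scalar(keys %big_cache), " keys\n";
```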
Re: Memory
by Anonymous Monk on Feb 20, 2002 at 16:15 UTC
    Is there a way to specify a memory size for perl to use, much like the java -Xm option? I get an "out of memory" message. Thanks. Phil Liu pliu@cifunds.com
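    Perl itself has no built-in equivalent of Java's heap flag, but on Unix you can cap the process's address space with setrlimit(2), for example through the CPAN module BSD::Resource (not core, so it may need installing). This is only a sketch, and the 256 MB cap is an arbitrary example figure; allocations past the limit then fail cleanly instead of thrashing the machine.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Example cap: 256 MB of address space (an arbitrary figure).
my $limit = 256 * 1024 * 1024;

if (eval { require BSD::Resource; 1 }) {
    # Set both the soft and hard limits on total address space.
    BSD::Resource::setrlimit(BSD::Resource::RLIMIT_AS(), $limit, $limit)
        or warn "setrlimit failed: $!";
}
else {
    # Without the module, the shell's ulimit does the same job, e.g.:
    #   ulimit -v 262144   (value in kilobytes)
    warn "BSD::Resource not installed; use the shell's ulimit -v instead\n";
}
```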