in reply to Re: gigantic daemons
in thread gigantic daemons

Thanks for the information. I gather that only one copy of perl is actually loaded, am I right?

Running on Linux -- check
Same server (well, same host) -- check
Same parent -- check
Same data -- Well, not exactly, but the amount of data these daemons use is small.

I am not loading entire web pages, I am only doing HEAD requests, and one at a time at that. The data storage is on a PostgreSQL server. Could it be I am misleading myself about actual memory consumption? I am looking at "ps aux" and "gnome-system-monitor". Could the memory shown perhaps not actually be physically allocated (or swapped out, maybe)?

Re: Re: Re: gigantic daemons
by esh (Pilgrim) on Sep 03, 2003 at 06:50 UTC

    I gather that only one copy of perl is actually loaded, am I right?
    Yes, one copy of the perl compiler/interpreter and probably one copy of your Perl source code (unless you "require", "do", or similarly load code at run time).

    Could it be I am misleading myself about actual memory consumption?
    Yes, it is possible that most of the parent/child memory listed by ps is shared amongst them.

    I tend to use the "top" command to watch the shared memory usage. It seems that ps should have an option to show shared memory, but I couldn't get it to work right off.

    What really matters is how much free memory you have on the system when everything is running. Try a before and after snapshot using something like "free" and make sure to look at the free memory +/- buffers/cache.
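    On a Linux of that vintage, `free` prints a "-/+ buffers/cache" line whose "free" column is the number to compare before and after starting the daemons. A sketch of pulling that figure out with awk (the sample output and its numbers are made up for illustration; newer procps versions format `free` differently, so check your own output first):

```shell
# Hypothetical sample output from `free -m` on a 2.4/2.6-era kernel.
sample='             total       used       free     shared    buffers     cached
Mem:           503        480         23          0         60        280
-/+ buffers/cache:        139        364
Swap:          509          4        505'

# The 4th field of the "-/+ buffers/cache" line is memory that is
# genuinely free once the kernel's buffer/page cache is discounted.
truly_free=$(printf '%s\n' "$sample" | awk '/buffers\/cache/ {print $4}')
echo "truly free: ${truly_free} MB"
```

    Run the same extraction against live `free -m` output before and after launching the daemons; the difference is what they actually cost you.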

    -- Eric Hammond

      Thanks, Eric. I seem to get similar results with "free" and "top". I do, indeed, "use" several modules in the daemons, which must explain why they aren't shared the way I think they should be. (Even so, 5.6 MB seems excessive...). Thanks very much for the help.
        use does its magic at compile time. If the modules are loaded before the fork, then they should be shared. Well, as much as the main script is, anyway.
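        To illustrate that point, here is a minimal sketch of a preforking daemon that pulls everything in with `use` in the parent, so the compiled code pages are shared copy-on-write by the children. LWP::UserAgent and the URL are placeholders for whatever modules and targets your daemons really use:

```perl
#!/usr/bin/perl
use strict;
use warnings;
# Load every module at compile time, in the parent. The pages holding
# perl itself, the compiled source, and these modules are then shared
# copy-on-write with the children after fork.
use LWP::UserAgent ();   # placeholder -- substitute your real modules

my $ua = LWP::UserAgent->new;   # built once, inherited by each child

for (1 .. 5) {
    my $pid = fork;
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: no require/do/eval here, so the code pages stay
        # shared; only data this child writes gets its own copy.
        my $res = $ua->head('http://example.com/');
        exit($res->is_success ? 0 : 1);
    }
}
wait() for 1 .. 5;   # reap the children
```

        A child that later does a run-time `require` (or builds large data structures) forces its own private copies of those pages, which is one way the per-process numbers in ps grow.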