geldmacher has asked for the wisdom of the Perl Monks concerning the following question:

Hello, I have a Perl script that keeps running out of memory. It stores a HUGE tree of objects in memory, and as soon as the tree grows past 1 GB of RAM, Perl reports "Out of memory!" and bails. The problem is that for the OS (HP-UX 10.20), the maximum process size (read from ulimit(2)) is 2 GB, so this leads me to believe the 1 GB constraint has something to do with Perl itself. Does anyone have any ideas how I can get over this obstacle? thanks, rusty

Replies are listed 'Best First'.
(jeffa) Re: Perl Process Size Limits?
by jeffa (Bishop) on Jul 11, 2001 at 01:17 UTC
    My boss just informed me that you should look into SPOPS (Standard Perl Object Persistence with Security); I will be looking into it as well. :)

    It's also available on CPAN: SPOPS

    Other options could be:

    • use MLDBM, since it will store complex data structures
    • use a more lightweight object framework, like Class::MakeMethods::Template::Flyweight (which keeps instance data in an external index rather than inside each object)
    • or store only keys to objects, which are pulled on demand from some persistent store
    SPOPS or MLDBM would be ideal, but the last option is highly effective in the long run, especially if you Memoize the object lookups in an LRU (Least Recently Used) cache; see the sketches below.
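
    Here is a minimal sketch of the MLDBM route (the file name tree.db and the node layout are hypothetical). MLDBM ties a hash to a DBM file and serializes nested structures through it, so the tree lives on disk rather than in RAM:

    use strict;
    use MLDBM qw(DB_File Storable);   # DB_File back end, Storable serializer
    use Fcntl;

    tie my %tree, 'MLDBM', 'tree.db', O_CREAT|O_RDWR, 0640
        or die "Cannot tie tree.db: $!";

    # Each store freezes the whole nested structure for that key.
    $tree{node42} = { name => 'node42', children => ['node43', 'node44'] };

    # Classic MLDBM caveat: you can't modify a nested element in place.
    # Fetch the structure, change it, then store it back:
    my $node = $tree{node42};
    $node->{name} = 'renamed';
    $tree{node42} = $node;

    And a sketch of the last option, where fetch_object is a hypothetical routine that pulls one object out of whatever persistent store you use, and Tie::Cache::LRU (also on CPAN) caps how many stay in RAM:

    use Memoize;
    use Tie::Cache::LRU;

    tie my %cache, 'Tie::Cache::LRU', 10_000;   # keep at most 10_000 objects
    memoize('fetch_object', SCALAR_CACHE => [HASH => \%cache]);

    sub fetch_object {
        my $key = shift;
        # ... load and rebless the object from the persistent store ...
    }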

    Jeff

    R-R-R--R-R-R--R-R-R--R-R-R--R-R-R--
    L-L--L-L--L-L--L-L--L-L--L-L--L-L--
    
Re: Perl Process Size Limits?
by toma (Vicar) on Jul 11, 2001 at 04:08 UTC
    You might check to see that HP-UX is reporting memory with the block size you think it is. In some configurations it will report memory usage in 2k blocks instead of 1k blocks.

    If you are growing a large array, perl's algorithm for extending the array over-allocates, which can sometimes double the memory the array uses. As a rule of thumb I assume that about one third of the memory in arrays is wasted.

    You can fix this by pre-extending the array with an assignment statement:

    $#huge_array = 10_000_000;   # sets the last index, not the element count
    This sets the array's last index to 10 million (room for 10,000,001 elements) without going through perl's automatic array extender, so if you know how many elements you need ahead of time, you don't have to waste the space.
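
    A minimal sketch of the idea (the file name nodes.dat is hypothetical): pre-extend once, then fill, so perl never has to reallocate as the array grows.

    my @huge_array;
    $#huge_array = 10_000_000;            # grab all the slots up front

    open my $fh, '<', 'nodes.dat' or die "nodes.dat: $!";
    my $i = 0;
    while (my $line = <$fh>) {
        chomp $line;
        $huge_array[$i++] = $line;        # no reallocation while filling
    }
    close $fh;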

    It should work perfectly the first time! - toma

      You might check to see that HP-UX is reporting memory with the block size you think it is. In some configurations it will report memory usage in 2k blocks instead of 1k blocks.

      Yes! You were right - HP-UX was reporting in 512-byte blocks, so it was actually a 1 GB limit on processes. I fixed this limit on 10.20 by recompiling Perl with the -N linker switch. Interestingly enough, on HP-UX 11 you can fix it by running "chatr +q3p enable perl" on the perl executable. I haven't tried that yet, so don't quote me on it, though it should work :)

      rusty
Re: Perl Process Size Limits?
by Anonymous Monk on Jul 11, 2001 at 01:19 UTC
    Maximum data size (RLIMIT_DATA), maximum resident set size (RLIMIT_RSS), and maximum virtual memory (RLIMIT_VMEM) can all limit memory; maybe one of them is set to 1 gig. A quick way to check them from perl is sketched below. Also, you've probably thought of this, but perhaps your machine is actually out of memory.
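
    A minimal sketch for checking those limits from perl, assuming the CPAN module BSD::Resource is installed (not every OS defines every RLIMIT_* constant, so this only reports the ones it finds):

    use strict;
    use BSD::Resource;

    my $limits = get_rlimits();   # map of RLIMIT_* names available on this OS
    for my $name (qw(RLIMIT_DATA RLIMIT_RSS RLIMIT_VMEM RLIMIT_AS)) {
        next unless exists $limits->{$name};
        my ($soft, $hard) = getrlimit($limits->{$name});
        printf "%-12s soft=%s hard=%s\n", $name,
            map { $_ == RLIM_INFINITY ? 'unlimited' : $_ } $soft, $hard;
    }
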
      but perhaps your machine is actually out of memory.

      Make that "out of swap space". (:

              - tye (but my friends call me "Tye")