in reply to Perl Process Size Limits?

You might check to see that HP-UX is reporting memory with the block size that you think it is. In some configurations it will report memory usage in 2k blocks instead of 1k blocks.

If you are growing a large array, Perl extends it with a reallocation algorithm that can double the memory the array uses each time it runs out of room. As a rule of thumb, I assume that about one third of the memory in arrays is wasted.

You can fix this by pre-extending the array with an assignment statement:

$#huge_array = 10_000_000;
This sets the last index to 10 million (so the array holds 10,000,001 elements) in one allocation, bypassing perl's automatic array extender. If you know how many elements you need ahead of time, you don't have to waste the space.
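A minimal, runnable sketch of the same idea (the array name and size are just placeholders, not from the original node):

use strict;
use warnings;

my @huge_array;
# Pre-extend: one allocation up front instead of repeated
# doubling reallocations as the array grows element by element.
$#huge_array = 10_000_000;   # last index 10_000_000, i.e. 10_000_001 slots

# Fill in place; no further reallocation is needed.
$huge_array[$_] = $_ for 0 .. $#huge_array;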

It should work perfectly the first time! - toma

Re: Re: Perl Process Size Limits?
by geldmacher (Initiate) on Jul 11, 2001 at 18:33 UTC
    You might check to see that HP-UX is reporting memory with the block size that you think it is. In some configurations it will report memory usage in 2k blocks instead of 1k blocks.

    Yes! You were right - HP-UX was reporting in 512-byte blocks, so it was actually a 1GB limit on processes. I fixed this limit on 10.20 by recompiling Perl with the -N switch. Interestingly enough, on HP-UX 11 you can fix this by running "chatr +q3p enable perl" on the perl executable. I haven't tried that yet, so don't quote me on it, though it should work :)

    rusty