scmason has asked for the wisdom of the Perl Monks concerning the following question:

I have written a small script with a rather specific task: it reads variously sized files into memory, encodes them, and then writes them to a socket. I developed it on a 1.4 GHz SuSE Linux box (2.4 kernel) with 512 MB of RAM and everything works fine. When I take the script to the client, running AIX on an RS/6000, I get an out-of-memory error and the process dies after I read in about 120 MB. These machines have 512+ MB of RAM and were not even running X at the time. I have tested multiple scripts on these machines as well as on my Linux cluster and our Solaris cluster. The only ones that run out of memory are the AIX machines, and it always happens around 120 MB. Has anyone else run into this problem?

Re: AIX Memory problems
by blakem (Monsignor) on Oct 24, 2001 at 11:35 UTC
    My AIX-fu is a bit rusty, but it sounds like it's configured with user- or process-level resource limits. You might want to look at /etc/security/limits and ulimit or smit.

    -Blake

      Or, alternatively, reconsider the algorithm/code which you are using to encode your source data.

      The reason is that one thing not clear from your post (scmason) is whether your code loads the entire source file into memory at once, or whether the error occurs after having read (in segments) 120 MB of the source file. If this figure represents concurrent memory usage, it may be worth posting your code for review to see how that usage can be reduced.
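      If the script is slurping each whole file before encoding, reading in fixed-size chunks keeps memory usage flat regardless of file size. A minimal sketch (the `read_in_chunks` name, the handler callback, and the 64 KB chunk size are illustrative, not from the original script):

      ```perl
      use strict;
      use warnings;

      # Process a file in fixed-size chunks instead of loading it all at
      # once. The handler receives each chunk, e.g. to encode it and
      # write it to the socket. Returns the total bytes read.
      sub read_in_chunks {
          my ($path, $handler, $chunk_size) = @_;
          $chunk_size ||= 64 * 1024;    # 64 KB default, tune as needed
          open my $fh, '<', $path or die "Cannot open $path: $!\n";
          binmode $fh;
          my $total = 0;
          while (my $read = read($fh, my $buf, $chunk_size)) {
              $total += $read;
              $handler->($buf);         # e.g. encode and send this chunk
          }
          close $fh;
          return $total;
      }
      ```

      With this shape, peak memory is bounded by the chunk size rather than the largest file.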


      Ooohhh, Rob no beer function well without!

        Hi, there was rather intense duplication of data in the initial prototype, but I believe we stripped all of that out in code review. I was also able to speed up the encoding by encoding one file at a time rather than all the data at once. I suffered this problem even after removing the encoding feature for testing; the problem came after reading the files incrementally. Sure enough, in the limits section of security there was an entry for datasize: 128 MB.
        Thanks,
        scmason
      Thanks! I was able to solve the problem by adding this line
      system("unlimit datasize");
      which worked fine, because the entire script executes in one shell.
      Thanks, scmason
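      One caveat worth noting: `system("unlimit datasize")` runs in a child shell, and resource-limit changes made in a child do not propagate back to the parent Perl process (`unlimit` is also a csh built-in, so `/bin/sh` may not recognize it). The durable fix is the one found above in /etc/security/limits. An illustrative stanza (not verbatim from the poster's system; on AIX, `data` is the per-process data-segment limit in 512-byte blocks, and -1 means unlimited):

      ```
      * /etc/security/limits (AIX) -- raise the default data-segment limit
      default:
              data = -1
      ```

      Changes here typically take effect at the user's next login.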