My AIX-fu is a bit rusty, but it sounds like it's configured with user- or process-level resource limits. You might want to look at /etc/security/limits, ulimit, or smit.
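For reference, a quick way to inspect the current limit from the shell (a minimal sketch, assuming a POSIX shell on AIX; the stanza shown in the comment is a hypothetical example, and the units in /etc/security/limits may differ from what ulimit reports):

```shell
# Show the soft data-segment limit for this shell
# (prints a number or "unlimited")
ulimit -d

# Per-user defaults on AIX live in /etc/security/limits; a hypothetical
# stanza for one user might look like this (-1 means unlimited):
#   someuser:
#       data = -1
```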
-Blake
Or, alternatively, reconsider the algorithm/code you are using to encode your source data.
The reason for this is that one thing that is not clear from your post (scmason) is whether your code is loading the entire source file into memory at once, or whether the error occurs after having read (in segments) 120 MB of the source file. If this figure represents concurrent memory usage, it may be worth posting your code for review to see how that memory usage can be reduced.
Ooohhh, Rob no beer function well without!
Hi,
There was rather heavy duplication of data in the initial prototype, but I believe we stripped all of this out through code review. I was also able to speed up the encoding by encoding one file at a time rather than all the data at once. I hit this problem even after removing the encoding feature for testing; it came after reading the files incrementally. Sure enough, in /etc/security/limits there was an entry for datasize: 128 MB.
Thanks,
scmason
Thanks! I was able to solve the problem by adding this line
system("unlimit datasize");
which worked fine, because the entire script executes in one shell. Thanks, scmason
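One caveat worth noting: `unlimit` is a C-shell built-in, and a limit changed inside `system()` ordinarily applies only to that child shell and its children, not to the already-running Perl interpreter. The more conventional pattern is to raise the limit in the shell that launches the script, so the Perl process inherits it. A minimal sketch (the script name is a placeholder):

```shell
# Raise the data-segment limit for this shell, then start the script;
# the Perl process inherits the new limit from its parent shell.
# (Raising above the hard limit may require an /etc/security/limits change.)
ulimit -d unlimited
perl encode_files.pl   # placeholder name for the actual script
```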