in reply to Out Of Memory error at 950MB with 14GB free RAM

Why the ancient perl? You could try configuring with -Uusemymalloc and see if that makes a difference. Or try perl 5.8.3; I think there were some improvements in perl's malloc (what you get when usemymalloc is set), but I can't recall the details offhand (other than that malloc.c got its missing LotR quote added).
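
If you can get a rebuild done, it would look roughly like this (untested sketch; the -des flags just accept Configure's defaults, so adjust for your site):

    $ sh Configure -des -Uusemymalloc   # use the system malloc instead of perl's
    $ make && make test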

Update: http://groups.google.com/groups?threadm=m3d6jsp4eq.fsf%40franz.ak.mind.de would seem to indicate perl's malloc does have a 1GB limit (though I didn't see an actual authoritative statement to that effect in that thread.)

I'd encourage you to submit a perl bug report (see perldoc perlbug), and to try -Uusemymalloc. I note that you are not using 64-bit pointers, so you are going to have at very best a 4GB (and quite possibly a 2GB) limit anyway. See README.hpux for information on building a 64-bit perl.
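
You can check what the installed perl was built with first:

    $ perl -V:ptrsize       # ptrsize=4 means 32-bit pointers
    $ perl -V:usemymalloc   # 'y' means perl's own malloc is in use

And, as a rough sketch of what README.hpux describes (you need a 64-bit-capable compiler), a 64-bit build would be configured along these lines:

    $ sh Configure -des -Duse64bitall -Uusemymalloc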

Re: Re: Out Of Memory error at 950MB with 14GB free RAM
by aburker (Sexton) on Feb 10, 2004 at 10:00 UTC
    wow!

    I really didn't expect this many answers after a week of researching this topic and finding not too much, THANKS!

    Your link really helped, and it brought me to the conclusion that it is very likely the usemymalloc switch, which is set on the server's perl, that is causing the problem.

    Unfortunately the sysadmin won't recompile perl without the switch (he will only use prepackaged packages provided by HP).

    He will probably upgrade to 5.8, but this can take time... and (after reading the postings in your link) it will not fix the problem anyway!

    other answers:
    *) I can't map this to hard disk (performance issue).
    *) I know 950MB is a lot of RAM, but I have to build up a complex structure made of many small strings, and that structure must be fast to handle. That is why the RAM use gets so big! (See the sketch just after this list.)
    *) And normally RAM is not the problem (why would someone buy a server with 8 CPUs and 16GB of RAM if not for performance?). It's just bad that perl can't keep up with that!
    *) The 4GB limit is not the problem; my program would need just 10% more memory to handle the largest file, and by now all the tweaks I could find are already done :-)
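
    As hinted above, per-scalar overhead is what makes a structure of small strings balloon far beyond the raw payload. A rough way to measure it, assuming the CPAN module Devel::Size is available (the numbers below are illustrative, not from my actual data):

        $ perl -MDevel::Size=total_size -le '
            my %h = map { $_ => "x" x 10 } 1 .. 100_000;  # ~1MB of string payload
            print total_size(\%h);                        # reports several times that
        '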

    conclusion:
    +++++++++++
    I will have to rewrite the program and make it slower, but this seems to be fewer days of work than recompiling perl and checking all the other scripts :-((((((

    But anyway thanks for your response!

      It's possible you're being bitten by the HP-UX caveat of maxdsiz/maxdsiz64. This kernel tunable limits the maximum data-segment size any single process can grow to, and no amount of recompiling will fix it. Run "kmtune -q maxdsiz" to see whether this limit is what's stopping you. If so, fiddling with kmtune, or with "Kernel Configuration" from within SAM, will let you change it.
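
      A sketch of the query/adjust cycle (I believe the 64-bit tunable is spelled maxdsiz_64bit; the allowed ceilings, and whether a kernel rebuild and reboot are needed, depend on your HP-UX release):

          $ kmtune -q maxdsiz               # data-segment limit for 32-bit processes
          $ kmtune -q maxdsiz_64bit         # the corresponding 64-bit limit
          $ kmtune -s maxdsiz=1610612736    # request a ~1.5GB limit, then rebuild/reboot as needed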


      davis
      It's not easy to juggle a pregnant wife and a troubled child, but somehow I managed to fit in eight hours of TV a day.
      Unfortunately the sysadmin won't recompile perl without the switch (he will only use prepackaged packages provided by HP).
      Point the sysadmin to the HP-UX Porting Centre. (It's not immediately clear to me if the perl-5.8.3 package there is 64-bit or not.)
      Might you build your own userland perl, just for this program?
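
      An untested sketch of such a private build (the $HOME/perl prefix is only an example path):

          $ sh Configure -des -Dprefix=$HOME/perl -Uusemymalloc
          $ make && make test && make install
          $ $HOME/perl/bin/perl your_script.pl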