in reply to Out of memory during "large" request

bh_perl:

Did you review the cygwin documentation on changing the memory limit? I don't know whether cygwin still works that way (the last time I used this setting was on cygwin v1.5x, and I'm now on v1.7x), but give it a try and see if it relaxes your memory limit.

I just looked around and found a direct link to the documentation in question: http://cygwin.com/faq/faq-nochunks.html#faq.programming.adjusting-heap, so it looks like it's still relevant to the current version.
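Something like the following ought to do it from within perl, though I haven't tested it on my current setup; the registry key path (heap_chunk_in_mb, value in MB) is the one I recall for cygwin 1.7, so double-check it against the FAQ for your version before running:

    use strict;
    use warnings;

    # Assumed key path for cygwin 1.7 -- verify against the FAQ linked above.
    my $key = '/HKLM/SOFTWARE/Cygwin/heap_chunk_in_mb';

    # Set the heap limit to 1024MB, then read it back to confirm.
    system('regtool', '-i', 'set', $key, '1024') == 0
        or die "regtool set failed: $?";
    system('regtool', 'get', $key);

You'll likely need an elevated shell to write under HKLM, and only newly started cygwin processes will pick up the change.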

...roboticus

When your only tool is a hammer, all problems look like your thumb.

Re^2: Out of memory during "large" request
by chm (Novice) on Jan 04, 2011 at 02:27 UTC

    The default cygwin heap size is around 300MB, so increasing that value will help. A few things I discovered while doing this:

    (1) If you set the value too high, cygwin or the whole system can lock up. Be sure to stay below the system swap space minus the memory used by any running processes.

    (2) cygwin perl appears to be built with perl's own malloc, which requests a pool of memory large enough for your allocation and then some. I've seen this result in a request to the system roughly 2X larger than what the program asked for, which can lead to OOM death much sooner than one might expect. At the very least, it can run out of memory one large allocation earlier than expected... then boom.
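    If you want to see where the limit actually bites on your box, a throwaway probe like this (just a sketch, not from my original testing) will grow a string until perl dies with "Out of memory!"; watch how much it reports holding at the point of death:

        use strict;
        use warnings;

        my $buf = '';
        my $mb  = 0;
        while (1) {
            $buf .= 'x' x (10 * 1024 * 1024);   # grow by roughly 10MB per pass
            $mb  += 10;
            print "holding roughly ${mb}MB\n";
        }

    Note that extending $buf forces reallocations, so the peak request perl's malloc makes can be noticeably larger than the string itself, which is the roughly-2X effect described above.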