
Out of memory during "large" request

by bh_perl (Monk)
on Dec 31, 2010 at 11:59 UTC ( #879927=perlquestion )

bh_perl has asked for the wisdom of the Perl Monks concerning the following question:


I ran my Perl program under Cygwin and got the error message below:
Out of memory during "large" request for 134221824 bytes, total sbrk() is 321808384 bytes at ./ line 40.

Why did this happen? My code is OK.

Thank you
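Without the script we can only guess what line 40 does, but a single large allocation is enough to produce this message. The snippet below is purely illustrative (the name and size are made up, chosen only to approximate the 134221824-byte request in the error):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical stand-in for "line 40": one request for a ~128 MB scalar.
# Under cygwin's default heap of roughly 300 MB, a request like this on
# top of data the script already holds can push sbrk() past the limit.
my $n   = 128 * 1024 * 1024;   # 134_217_728 bytes, close to the
                               # 134_221_824 in the error message
my $buf = "\0" x $n;

printf "allocated %d bytes\n", length $buf;
```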

Replies are listed 'Best First'.
Re: Out of memory during "large" request
by zentara (Archbishop) on Dec 31, 2010 at 12:54 UTC
    my code is OK

    Then what is the problem?

    If you Google for "Out of memory during "large" request total sbrk()" you will get your answers.

    Without seeing your code, I'm using the ESP module to guess that you are trying to upload a file too big for your upload_size_limit on the server.

    I'm not really a human, but I play one on earth.
    Old Perl Programmer Haiku ................... flash japh
Re: Out of memory during "large" request
by ww (Archbishop) on Dec 31, 2010 at 12:55 UTC
    It appears to me that your program has asked for additional memory at line 40 and that the machine doesn't have that much available.

    Down in the gloomy bowels of the underlying C, sbrk() grows the program's data space by the requested number of bytes. See sbrk(2) and setrlimit(2) in your man pages.

    Caveat: This may not be very helpful in solving your problem; with a bit more information, we may be able to offer some possible workarounds.

Re: Out of memory during "large" request
by roboticus (Chancellor) on Dec 31, 2010 at 14:43 UTC

    The default cygwin heap size is around 300MB, so increasing that value will help. Some things I discovered along the way:

    (1) If you set the value too high, things can lock up for cygwin or the system. Be sure to stay below the system swap space less the memory of any running processes.

    (2) cygwin perl seems to be built with perl's own malloc, which requests a pool of memory large enough for your request and then some. I've seen this result in a request from the system roughly 2X larger than expected, which can lead to OOM death much sooner than one might anticipate. At the very least, it can run out of memory one large allocation earlier than the sizes you requested would suggest...then boom.
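For reference, the Cygwin User's Guide documents the heap-size knob as a registry value named heap_chunk_in_mb, read when the cygwin DLL starts. A sketch of raising it with Cygwin's regtool (the value is in megabytes; run from an elevated shell, then restart all cygwin processes so the new limit takes effect):

```shell
# Raise cygwin's maximum heap to 1024 MB for all users (HKLM hive).
regtool -i set /HKLM/Software/Cygwin/heap_chunk_in_mb 1024

# Confirm the value was written.
regtool -v list /HKLM/Software/Cygwin
```

As roboticus warns above, pick a value below your swap space minus what other running processes need, or the whole system can lock up.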

Re: Out of memory during "large" request
by runrig (Abbot) on Dec 31, 2010 at 15:51 UTC
    You're right, line 40 is ok, but you set your $frobnitz parameter too high at line 39.
      Hahaha! $frobnitz - hilarious!
Re: Out of memory during "large" request
by Khen1950fx (Canon) on Dec 31, 2010 at 20:57 UTC
    From perldiag:

    Out of memory during "large" request for %s

    (F) The malloc() function returned 0, indicating there was insufficient remaining memory (or virtual memory) to satisfy the request. However, the request was judged large enough (compile-time default is 64K), so a possibility to shut down by trapping this error is granted.
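    Since perldiag says the error is trappable when the request exceeds the 64K threshold (on builds that use perl's own malloc, as cygwin's perl appears to), the risky allocation can be wrapped in an eval so the program shuts down cleanly instead of dying mid-run. A minimal sketch; the catch branch is only reachable on a build where the failure really is trappable:

    ```perl
    use strict;
    use warnings;

    # Wrap the large request in eval: if malloc() returns 0 and the
    # request is over the compile-time threshold, perl raises a
    # trappable fatal error rather than aborting outright.
    my $buf;
    my $ok = eval {
        $buf = "\0" x (16 * 1024 * 1024);   # modest 16 MB demo request
        1;
    };
    if ($ok) {
        printf "got %d bytes\n", length $buf;
    } else {
        # Release what we can, flush logs, close files, exit gracefully.
        warn "large request failed: $@";
    }
    ```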

Node Type: perlquestion [id://879927]
Approved by ww
Front-paged by Arunbear