in reply to Allocating more memory to perl process...
Generally you can count on getting at least 1G, usually 2G, and on a big machine with a perl that doesn't use perl's own malloc (which has a 2G limitation) you should be able to use all the memory your system has available.
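Whether your perl was built with its own malloc is recorded in its build configuration; a quick way to check it, using the core Config module that ships with every perl:

```perl
use Config;
# 'y' means perl was built with its own malloc (and its 2G cap);
# 'n' means it calls the system's malloc directly.
print "usemymalloc = $Config{usemymalloc}\n";
```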
There are some bits of perl that require contiguous memory, so you may find yourself facing out-of-memory errors before the system as a whole actually runs out of memory. These cases are reasonably uncommon, though, and usually show up with exceedingly large arrays, or hashes with an extraordinary number of keys in them. (Which does happen from time to time. Or $time[2*time()], if you'd rather :)
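To make that joke concrete: a single assignment to a far-off array index forces perl to grow the array's pointer table up to that index in one contiguous allocation, so it is exactly the kind of code that hits an allocation limit long before total memory is exhausted. A sketch, which you should expect to die with "Out of memory!" rather than finish:

```perl
use strict;
use warnings;

my @time;
# time() is seconds since 1970, so 2*time() is on the order of
# billions; perl must allocate one pointer slot for every index
# up to that, in a single contiguous block, before it can store
# the value.
$time[2 * time()] = 1;
print scalar(@time), " elements\n";   # unreachable on most machines
```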
Are you having memory issues? If so, details of OS version, perl version, and some symptoms might help figure out what's wrong.
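If it helps, most of those details can be pulled straight out of perl itself; a minimal sketch, again leaning on the core Config module:

```perl
use Config;
# OS name/version as seen by the perl build, plus the perl version
print "OS:   $Config{osname} $Config{osvers}\n";
print "perl: $]\n";
```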