in reply to Allocating more memory to perl process...

As far as I know, Perl on Linux, BSD, and OS X will happily allocate memory until the kernel kills the process (on Linux, via the OOM killer) as usage nears 100%. I have 1.5 GB of RAM and have had Perl scripts climb to 90% memory usage without requiring any special command line switch. (A quick way to watch this behavior yourself is sketched below.)

Could you provide more information about your operating system, and why you aren't getting the memory you need? Part of Perl programming is supposed to be never worrying about memory until you run out, but these issues are heavily OS-dependent.
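
If you want to see it in action, a throwaway hog script like this one (my own sketch, not anything from the original post) will grow until the OS steps in; keep top open in another terminal while it runs:

    #!/usr/bin/perl
    # Append 1 MB per iteration until the kernel kills the process.
    # Run this only on a box where you don't mind the OOM killer firing.
    my $hog = '';
    $hog .= 'x' x 1_048_576 while 1;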


Re: Re: Allocating more memory to perl process...
by Bamafan (Initiate) on Nov 12, 2002 at 23:45 UTC
    I'm implementing a recursive subroutine to find all the permutations of elements in a list of lists. You can look here to find out more on that subroutine.

    Unfortunately, with the way the subroutine is implemented, the interpreter complains when a certain level of recursion is reached. Inside the debugger, it complains when the 100th level of recursion is hit and exits ("100 levels deep in subroutine calls!" is the error message).

    It's not like the elements I'm using are particularly memory intensive. A conservative estimate is that at most 2,000 characters are being pushed around at any one time. Could it actually be a limitation on the size of the call stack? (A minimal reproduction of the complaint follows below.)

    I'm running under Linux (one of the most recent releases; I can't recall the exact version number) with 250 MB of RAM and generally 190 MB free at any moment.
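
    For what it's worth, the 100-levels complaint is easy to reproduce outside the real code. A minimal sketch (the sub name is mine, not the actual permutation routine):

        #!/usr/bin/perl
        use warnings;   # enables the "Deep recursion" warning

        sub descend {
            my ($depth) = @_;
            return if $depth <= 0;
            descend( $depth - 1 );
        }

        # Past 100 levels, warnings prints "Deep recursion on subroutine
        # 'main::descend'"; under the debugger, execution stops with
        # "100 levels deep in subroutine calls!".
        descend(150);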
      According to the perl manpage, "Recursion is of unlimited depth".

      Perhaps your problem is with the debugger? Perl will generate the warning when you get 100 levels deep if you have warnings turned on, but it shouldn't die on you. What happens when you run the script without the debugger?
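
      If the warning itself is all that's in the way, it can also be switched off lexically with the standard warnings pragma. A self-contained sketch (the sub name is a placeholder):

          #!/usr/bin/perl
          use warnings;

          sub permute {
              no warnings 'recursion';   # silence only the deep-recursion warning
              my ($n) = @_;
              return if $n <= 0;
              permute( $n - 1 );         # no warning, even past 100 levels
          }

          permute(500);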

      I agree that the problem isn't likely to be memory usage; you can run the program while watching the output from top sorted by memory to confirm that.

      Could you post some sample data so we can try to reproduce your error?

        Actually, I'll change my original answer. After studying the algorithm, my implementation, and how I'm using it (and after taking your advice and running top), my resources really do get clobbered. Once free memory drops to around 3000K or so, the process gets killed. I'll probably just need to reimplement the algorithm in an iterative fashion... hey, this can be my new question to you gurus: how can I reimplement merlyn's algorithm that I linked above in an iterative fashion? :) (One iterative sketch follows below.)

        Thanks, Bamafan
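
        On the iterative question: a minimal sketch of one approach, using odometer-style index counters over the list of lists instead of recursion (the data and variable names are mine, not merlyn's original code):

            #!/usr/bin/perl
            use strict;
            use warnings;

            # @idx holds one index per sub-list and "carries" like a car
            # odometer, so no recursion (and no deep call stack) is needed.
            my @lists = ( [ 1, 2 ], [ 'a', 'b', 'c' ], [ 'X', 'Y' ] );
            my @idx   = (0) x @lists;

            while (1) {
                print join( ' ', map { $lists[$_][ $idx[$_] ] } 0 .. $#lists ), "\n";
                my $i = $#lists;
                while ( $i >= 0 && ++$idx[$i] >= @{ $lists[$i] } ) {
                    $idx[ $i-- ] = 0;   # this digit rolled over; carry left
                }
                last if $i < 0;         # every digit rolled over: done
            }

        The same explicit-bookkeeping trick (an array of counters, or a stack of partial results) replaces the call stack in most recursive generators.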
      Usually when you bomb out of a deep recursion, you have reached your stack size limit. If you are on a Unixy system, use ulimit -s or ulimit -a to see your stack size limit. If you are lucky, you can increase it yourself; if not, you'll have to tweak the kernel-imposed limit.

      CU
      Robartes-
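
      For example, on a typical Linux shell (the exact limits, and whether you may raise them, vary by system; the script name is a placeholder):

          $ ulimit -s              # show the current soft stack limit (usually in kB)
          $ ulimit -s unlimited    # raise it for this shell, if the hard limit allows
          $ perl permute.pl        # then re-run the recursive script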