in reply to Re: Is (DOS-)Perl a Memory Hog?
in thread Is (DOS-)Perl a Memory Hog?

Please see my reply to tachyon for answers to some of your questions.

As a novice programmer, I don't understand fork emulation or, for that matter, much else about how Perl interacts with an operating system such as MS-DOS. I was hoping to delay that part of my education in the interest of getting some work done. (My Perl scripts "only" need to do what a simple DOS batch file could do: control which experiment runs next. They also need to do fancier, more Perlish things, like rewriting my code to change experiment timing conditions, and processing and integrating data files to get everything ready for statistical analysis.)

Any insights would be appreciated.

Re: Re: Re: Is (DOS-)Perl a Memory Hog?
by Albannach (Monsignor) on May 11, 2004 at 01:41 UTC
    You've given me serious flashbacks to the days we spent hour upon hour tweaking memory-loading arrangements and rebooting over and over until some critical DOS app could run... QEMM... memmaker... *shudder*.

    I'd suggest you stick with batch files as much as you can, since they will give you the maximum memory possible, and then call your various Perl scripts at the appropriate points in the batch files. The scripts should still be able to read and write your DOS apps' config files and input/output data files. I don't think you gain much by launching a DOS shell from Perl, as the host script can't do anything until the shell exits.

    If you need to maintain state information, have your scripts write variables into your own data files in order to pass that information along to subsequent invocations in the batch chain (a minimal sketch follows). There are lots of interesting batch tricks that might help, and if you are careful (and if my memory serves) I believe you can even rewrite the currently executing batch file, since it is only read line by line.
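
    By writing variables into your own data files, I mean something like the sketch below (untested, off the top of my head; the state.txt name and the key=value format are just what I'd pick, nothing standard). It sticks to bareword filehandles and 2-arg open, since the DOS ports are older Perls:

        # Minimal sketch: persist state across Perl invocations in a
        # batch chain. Bareword filehandles and 2-arg open are used
        # deliberately, since the DOS ports are older Perls.
        my %state;
        my $state_file = 'state.txt';   # name is arbitrary

        # Pick up whatever the previous script in the chain left behind.
        if (open(STATE, "< $state_file")) {
            while (<STATE>) {
                chomp;
                my ($key, $value) = split(/=/, $_, 2);
                $state{$key} = $value if defined $key;
            }
            close(STATE);
        }

        # Do this script's real work -- here, just count how often we've run.
        $state{run_count} = ($state{run_count} || 0) + 1;
        print "starting run $state{run_count}\n";

        # Write everything back for the next invocation in the chain.
        open(STATE, "> $state_file") or die "can't write $state_file: $!";
        foreach my $key (sort keys %state) {
            print STATE "$key=$state{$key}\n";
        }
        close(STATE);

    Each script in the chain reads the file, does its bit, and writes the file back, so the batch file itself stays dumb and keeps its memory free for the DOS apps.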

    If you tell us more about your particular needs, some of us may be more helpful.

    --
    I'd like to be able to assign to an luser