Re: Is (DOS-)Perl a Memory Hog?
by tachyon (Chancellor) on May 10, 2004 at 23:38 UTC
If Perl is using up all of my extended memory, is there any way of telling it not to be so greedy?
No and yes. Perl does use a lot of memory, as it has made the design compromise of spending memory to gain speed. However, Perl uses precisely as much memory as you ask it to. If you post your code it may well be simple to optimise it to use less memory.
Also, by DOS do you really mean DOS, or do you mean the DOS window you can get from Windows? Do you really only have 40MB? How much memory does your DOS-based program use running all by its lonesome? Perl will 'typically' use circa 2-5MB for fairly trivial stuff and ramps up from there. If your DOS process needs almost all the memory you have, you may be flat out of luck.
Yes, I really mean MS-DOS, not a DOS window operating within Windows.
This computer, with its meager 40MB, is set up for running custom psychophysiology software under DOS, so that it can assume complete control over the processor (for timing-critical stimulus/monitoring control). How much memory it uses depends on the particular experiment programmed and what images and sounds it preloads. I'm sure it can get by sharing with Perl, if Perl will share with it.
My script for trying to run the other program is this:
#!/perl/bin/perl -w
use strict;
my @args = ("STIM2", "/SESSION=none", "/ORDER=bx-err1", "/WARN=1");
system(@args) == 0
    or die "System call failed: $?";
But the script I used to check Perl's memory hogging (where it showed all extended memory was used up) was this:
#!/perl/bin/perl -w
use strict;
system ("MEM") == 0
or die "System call failed: $?";
If there's a way of telling Perl to only take what it needs, I'd love to know.
Thanks.
rem start.bat, save in path then just type start
@echo off
stim2 /session=none /order=bx-err1 /warn=1
Command line args can be passed too; they appear in %1, %2, %3, etc., just like $ARGV[0], $ARGV[1], $ARGV[2], and so on in Perl.
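For instance, here is a minimal sketch of the Perl side (the script name runexp.pl is just a placeholder): if the batch file contains a line such as perl runexp.pl %1 %2, whatever was typed after the batch file name shows up in @ARGV.
#!/perl/bin/perl -w
use strict;
# Hypothetical runexp.pl, called from a batch file as: perl runexp.pl %1 %2
# The batch file's %1 and %2 arrive here as $ARGV[0] and $ARGV[1].
my ($session, $order) = @ARGV;
die "usage: runexp.pl SESSION ORDER\n" unless defined $order;
print "Running session '$session' with order '$order'\n";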
See This batch file tutorial
Re: Is (DOS-)Perl a Memory Hog?
by Zaxo (Archbishop) on May 10, 2004 at 23:41 UTC
"512k should be enough for anyone!" ;-)
Perl *is* a memory hog relative to old MSDOS systems. Every program is written with tradeoffs of memory vs. speed. Perl is historically lavish about memory, to get speed. You have more memory than most such systems, but Perl can easily use it all.
Does your script keep big hashes or arrays around? Try to pare away large global data structures. You can trade speed back for memory, too: favor looping constructs, line-by-line processing, and small lexical data structures in tight scopes.
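For example, here is a minimal sketch of line-by-line processing (the filename results.dat is just a placeholder): the file is read one record at a time, so only the current line sits in memory rather than the whole file.
#!/perl/bin/perl -w
use strict;
# Sketch only: results.dat is a placeholder filename.
# Reading line by line keeps just the current record in memory,
# instead of slurping the whole file into an array.
open(IN, "< results.dat") or die "Can't open results.dat: $!";
while (my $line = <IN>) {
    chomp $line;
    # ... work on $line here, writing results out as you go ...
}
close IN;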
Update: Also, MSDOS is unlikely to have very good fork emulation. A system call may be beyond the OS's capacity.
Please see my reply to tachyon for answers to some of your questions.
As a novice programmer, I don't understand fork emulation or, for that matter, much else about how Perl interacts with an operating system (such as MS-DOS). I was hoping to delay that part of my education in the interests of getting some work done. (My Perl scripts "only" need to do what a simple DOS batch file could do: control which experiment runs next. They also need to do fancier "Perlish" things, like rewriting my code to change experiment timing conditions, and processing and integrating files to get everything ready for statistical analysis.)
Any insights would be appreciated.
You've given me serious flashbacks to the days we spent hour upon hour tweaking memory loading arrangements and rebooting over and over until some critical DOS app could run... QEMM... memmaker... *shudder*.
I'd suggest you stick with batch files as much as you can, as they will give you the maximum memory possible, and call your various Perl scripts at the appropriate points in the batch files. The scripts should still be able to read/write your DOS app's config files and input/output data files. I don't think you gain much by launching a DOS shell from Perl, as the host script can't do anything until the shell exits.
If you need to maintain state information, have your scripts write variables into your own data files to pass information to subsequent invocations in the batch chain. There are lots of interesting batch tricks that might help, and if you are careful (and my memory serves) I believe you can even rewrite the current batch file while it is executing, since it is only read line by line.
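For example, a rough sketch of that state-file idea (the file name state.txt and the experiment names are placeholders): each script in the chain reads what the previous one left behind and writes a note for the next.
#!/perl/bin/perl -w
use strict;
# Sketch only: state.txt and the experiment names are placeholders.
# Read whatever the previous script in the batch chain left behind...
my $last = '';
if (open(STATE, "< state.txt")) {
    chomp($last = <STATE> || '');
    close STATE;
}
# ...then decide what comes next and leave a note for the next invocation.
my $next = ($last eq 'bx-err1') ? 'bx-err2' : 'bx-err1';
open(STATE, "> state.txt") or die "Can't write state.txt: $!";
print STATE "$next\n";
close STATE;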
If you tell us more about your particular needs, some of us may be more helpful.
--
I'd like to be able to assign to an luser
Re: Is (DOS-)Perl a Memory Hog?
by BrowserUk (Patriarch) on May 11, 2004 at 01:46 UTC
I've never used perl under MS-DOS, and I haven't used MS-DOS for 10+ years, but I do vaguely remember a similar situation with an early DOS port of REXX that effectively blocked access to extended memory. I don't recall whether that was EMS or XMS (or even what the difference is, though I knew it once).
I suspect that the problem is not that Perl is using all the extended memory, but rather that it is trampling on something that effectively blocks access to it.
I tried to think of some way to verify this suspicion; I even dug out a couple of old MS-DOS internals books that I keep for "them's was the days" purposes, but nothing leaped off the index or contents pages that looked like it would help :(
Examine what is said, not who speaks.
"Efficiency is intelligent laziness." -David Dunham
"Think for yourself!" - Abigail