This computer, with its meager 40MB, is set up to run custom psychophysiology software under DOS so that the software can take complete control of the processor (for timing-critical stimulus and monitoring control). How much memory it uses depends on the particular experiment programmed and on what images and sounds it preloads. I'm sure it can get by sharing with Perl, if Perl will share with it.
My script for trying to run the other program is this:
#!/perl/bin/perl -w
use strict;
my @args = ("STIM2", "/SESSION=none", "/ORDER=bx-err1", "/WARN=1");
system(@args) == 0
    or die "System call failed: $?";
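For what it's worth, $? after system() packs more than a pass/fail flag. Here's a sketch of a more informative check, assuming DOS Perl decodes the wait status the way standard Perl does (the program's own exit code in the high byte):
#!/perl/bin/perl -w
use strict;
my @args = ("STIM2", "/SESSION=none", "/ORDER=bx-err1", "/WARN=1");
my $status = system(@args);
if ($status == -1) {
    # system() couldn't launch the program at all; $! says why
    die "Failed to run STIM2: $!";
}
elsif ($status != 0) {
    # decode the wait status: the exit code lives in the high byte
    die "STIM2 exited with code ", $status >> 8, "\n";
}
That way a failure to launch (not enough memory, program not found) is distinguishable from STIM2 itself exiting with an error.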
But the script I used to check Perl's memory hogging (the one that showed all extended memory used up) was this:
#!/perl/bin/perl -w
use strict;
system ("MEM") == 0
or die "System call failed: $?";
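If it would help to capture MEM's figures rather than just echo them to the screen, backticks should do it. This is a sketch, assuming DOS Perl's backticks behave like standard Perl's (returning the command's output as a string, or undef if the command couldn't be run):
#!/perl/bin/perl -w
use strict;
# Capture MEM's report so the free-memory numbers can be read
# (or parsed) from inside Perl before launching STIM2.
my $report = `MEM`;
defined $report or die "Couldn't run MEM: $!";
print $report;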
If there's a way to tell Perl to take only what it needs, I'd love to know. Thanks.