mtmcc has asked for the wisdom of the Perl Monks concerning the following question:
Worthy Monks,
I have a few Perl scripts that manipulate text in large text files (~20,000,000 lines), reading them with while (<FH>) loops. Apart from a few small arrays and hashes, they use very little memory.
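For context, the loop described above follows the usual constant-memory, line-at-a-time pattern; the filename argument and the per-line work below are placeholders:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# A constant-memory pass over a large file: only the current
# line is held in memory at any time, so usage stays flat
# regardless of file size. The per-line work is a placeholder.
my $file = $ARGV[0];
open(my $fh, '<', $file) or die "Can't open $file: $!";
while (my $line = <$fh>) {
    chomp $line;
    # ... manipulate $line here ...
}
close $fh;
```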
The programs work as expected on both Ubuntu 12.04 (Perl 5.14.2) and OSX Mountain Lion (Perl 5.12.4).
On OSX, however, the available 'free' memory is quickly converted to 'inactive' memory, whereas the system monitor on Ubuntu shows essentially flat memory usage.
The only downside I can see is that on OSX, other programs running simultaneously become slower to respond; the script itself seems unaffected.
I was wondering if anybody else has noticed this, and whether there is an obvious general explanation that I'm missing.
Thank you for any opinions you might have.

PS: As an example, when run on a text file with 20,000,000 lines, the script below flies through on Ubuntu, while on OSX the memory conversion happens:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $fileForLineCount = $ARGV[0];
my $lineCount = 0;
my $buffer = '';

open(my $targetHandle, "<", $fileForLineCount)
    or die "Can't open $fileForLineCount: $!";
while (sysread $targetHandle, $buffer, 4096) {
    $lineCount += ($buffer =~ tr/\n//);
}
close $targetHandle;

print STDERR "$lineCount\n\n";
```
Thanks for your help!
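One plausible (but here unverified) explanation for the 'inactive' growth is that the kernel is caching the file's pages as they are read. On Darwin/OSX, the fcntl F_NOCACHE command can ask the kernel not to cache reads on a given descriptor. A minimal sketch — the constant value 48 is Darwin's F_NOCACHE from <sys/fcntl.h> and is an assumption to verify locally; on other platforms this fcntl simply fails and reads proceed normally cached:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Darwin's F_NOCACHE fcntl command is not exported by the Fcntl
# module; 48 is its value in <sys/fcntl.h> on OSX (an assumption
# to verify on your system). Where the command is unsupported,
# the fcntl fails and we fall back to normal cached reads.
my $F_NOCACHE = 48;

my $file = $ARGV[0];
open(my $fh, '<', $file) or die "Can't open $file: $!";
fcntl($fh, $F_NOCACHE, 1)
    or warn "F_NOCACHE not honoured here: $!";

my ($buffer, $lineCount) = ('', 0);
while (sysread $fh, $buffer, 4096) {
    $lineCount += ($buffer =~ tr/\n//);
}
close $fh;
print "$lineCount\n";
```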
Replies are listed 'Best First'.
Re: Inactive memory and OSX
by BrowserUk (Patriarch) on Jul 04, 2013 at 22:01 UTC

Re: Inactive memory and OSX
by dave_the_m (Monsignor) on Jul 04, 2013 at 21:41 UTC
- by Anonymous Monk on Jul 05, 2013 at 09:10 UTC

Re: Inactive memory and OSX
by mtmcc (Hermit) on Jul 05, 2013 at 07:48 UTC
- by BrowserUk (Patriarch) on Jul 05, 2013 at 08:24 UTC
- by mtmcc (Hermit) on Jul 05, 2013 at 09:29 UTC
- by BrowserUk (Patriarch) on Jul 05, 2013 at 09:51 UTC