in reply to Out of memory

You'll double the size of the file you can load by using this:

my $data; do {local $/; $data = <FILE> };

Instead of:

my $data = do {local $/; <FILE> };

And, though I haven't tested it recently, it's probably faster to do:

my $size = -s $filename;
open my $fh, '<', $filename or die $!;
read( $fh, my $data, $size );

But if you're limited to a few hundred MB on a system with many GB available, Corion's right: either your process is running a 32-bit Perl, or it is subject to a ulimit, or both.


With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority". I'm with torvalds on this
In the absence of evidence, opinion is indistinguishable from prejudice. Agile (and TDD) debunked

Re^2: Out of memory
by Anonymous Monk on Jun 09, 2015 at 17:05 UTC

    You'll double the size of file you can load
    This was probably an issue before 5.20 and should no longer matter with recent versions, now that copy-on-write strings avoid the extra copy on assignment.