in reply to Re: Memory usage by perl application
in thread Memory usage by perl application

Hi Corion,

I am doing this just as an exercise to check memory usage on the system. Please find my code snippet below:
open FILE, $input_file or die $!;
my @data;
while (<FILE>) { push @data, $_; }
close FILE;
The file I am reading is 2 GB in size.
In this scenario I would expect the total memory usage to be around 2 GB, or at most somewhere around 4 GB, yet my system shows about 3.8 GB of memory usage.
Does the filehandle FILE also store the full file in memory, in addition to my array holding the full file?

Is this acceptable? Let me know your thoughts.
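
For reference, this is roughly how I am checking the memory; it is only a sketch and assumes a Linux system, since it reads the VmRSS line from /proc/self/status to see how much memory the process actually holds:

use strict;
use warnings;

my $input_file = shift @ARGV or die "Usage: $0 <file>\n";

# Read the whole file into an array, as in the snippet above.
open my $fh, '<', $input_file or die "Cannot open $input_file: $!";
my @data;
while (<$fh>) {
    push @data, $_;
}
close $fh;

# VmRSS is the resident memory of this process as reported by the kernel
# (Linux-specific assumption).
open my $status, '<', '/proc/self/status' or die "Cannot read /proc/self/status: $!";
while (<$status>) {
    print if /^VmRSS:/;
}
close $status;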

Thanks,
Manu

Re^3: Memory usage by perl application
by Corion (Patriarch) on Dec 22, 2010 at 18:01 UTC

    In your first post, you talked about a hash. I see no hash in your code.

    See illguts, again. It talks about the underlying data structures and their memory needs.

    Depending on how long each line in $input_file is, Perl can, again, use up to 16 times the memory (the worst case being lines of one character each, where every line carries roughly 16 bytes of per-scalar overhead, disregarding the SvPV body and the overhead of the array itself).
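
    If you want to see that overhead directly, a sketch like the following (it assumes the Devel::Size module from CPAN is installed) shows how much memory a single short line and a whole array of such lines actually take:

        use strict;
        use warnings;
        use Devel::Size qw(size total_size);   # assumes Devel::Size is installed from CPAN

        my $line = "x\n";                      # a very short "line"
        my @data = ($line) x 1_000;            # 1000 separate copies of it

        # size() reports the memory of the scalar itself, including per-SV overhead;
        # total_size() follows the reference and counts the array plus all its elements.
        printf "one short line:  %d bytes\n", size($line);
        printf "1000-line array: %d bytes\n", total_size(\@data);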

    This behaviour is acceptable to me. There are very few reasons to read a file completely into an array. If you really need to handle generic large data structures, a database like Postgres or SQLite will most likely suit your needs far better than holding the data in Perl structures will.
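
    If you only need to look at each record once, a plain read loop that handles one line at a time keeps memory flat no matter how large the file is; a rough sketch:

        use strict;
        use warnings;

        my $input_file = shift @ARGV or die "Usage: $0 <file>\n";
        open my $fh, '<', $input_file or die "Cannot open $input_file: $!";

        my $count = 0;
        while ( my $line = <$fh> ) {
            # process $line here instead of keeping it around;
            # only the current line is held in memory at any time
            $count++;
        }
        close $fh;

        print "read $count lines\n";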