in reply to System out of memory while storing huge amount of data
Sometimes using a simpler data structure is very effective in conserving memory. For example, creating an array containing 1 million fairly simple hashes:
perl -e" my @a; push @a, { name=>'fred', surname=>'bloggs', age=>'ancient', dob=>'0/0/0000', empno=>1234567890 } for 1..1e6; sleep 10"
uses 673MB of RAM on my 64-bit system.
However, storing exactly the same information using strings:
perl -e" my @a; push @a, join( $;, name=>'fred', surname=>'bloggs', age=>'ancient', dob=>'0/0/0000', empno=>1234567890 ), 1..1e6; sleep 10"
only takes 87MB.
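If you want to see where those numbers come from without watching the process monitor, the CPAN module Devel::Size (assuming you have it installed) can report the in-memory size of individual structures. A minimal sketch comparing one record in each representation:

    use strict; use warnings;
    use Devel::Size qw( total_size );

    # One record in each representation
    my %rec = ( name=>'fred', surname=>'bloggs', age=>'ancient',
                dob=>'0/0/0000', empno=>1234567890 );
    my $str = join $;, %rec;

    # total_size() walks references, so pass a ref to each
    printf "hash:   %d bytes\n", total_size( \%rec );
    printf "string: %d bytes\n", total_size( \$str );

Multiply the per-record difference by 1e6 and the gap between 673MB and 87MB stops being surprising.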
Using strings rather than hashes during the data accumulation phase can often mean that you can store 6 or 7 times as much data in the same memory. The strings are easily turned back into hashes on a case-by-case basis as required: my %temp = split $;, $array[ $i ];
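For example, a minimal round trip (the field values here are just the ones from the one-liners above):

    use strict; use warnings;

    my @array;
    push @array, join( $;, name=>'fred', surname=>'bloggs', empno=>$_ )
        for 1 .. 10;

    # Inflate a single record back into a hash only when it is needed
    my %temp = split $;, $array[ 3 ];
    print "$temp{name} $temp{surname} ($temp{empno})\n";

The one caveat is that none of the values may themselves contain the separator; $; defaults to chr(28), the ASCII FS control character, which rarely appears in real data.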
Whilst there may be some performance penalty from building the strings and then converting them back to hashes on demand, it is often far less than the performance hit of moving to disk-based storage.
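If you want to quantify that penalty for your own records, the core Benchmark module will compare the two approaches directly. A rough sketch along these lines:

    use strict; use warnings;
    use Benchmark qw( cmpthese );

    cmpthese( -3, {
        # Build a hash and read one field back
        hash => sub {
            my %h = ( name=>'fred', surname=>'bloggs', empno=>1234567890 );
            my $x = $h{surname};
        },
        # Build a string, inflate it on demand, read the same field
        string => sub {
            my $s = join $;, name=>'fred', surname=>'bloggs', empno=>1234567890;
            my %h = split $;, $s;
            my $x = $h{surname};
        },
    } );

Expect the string variant to come out slower per record; the point is that you only pay that price for the records you actually inflate.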
Replies are listed 'Best First'.
Re^2: System out of memory while storing huge amount of data
by Marshall (Canon) on Oct 28, 2010 at 15:21 UTC
by BrowserUk (Patriarch) on Oct 28, 2010 at 15:39 UTC