PerlMonks
> push @p, [ $p1, $p2 ];
I guess that creating 500e6 sub-arrays is the most memory-consuming part. I can't tell how efficient the usage of @d is, because it depends on how sparsely the entries are used. 32 bits means ~4.2e9 potential entries, but with only 500e6/256 = 1,953,125 distinct entries in the worst case, you'd end up with a ratio of roughly 2200 empty slots per used entry in @d. I bet a hash is more memory-efficient then. But it really depends on what you are trying to achieve with all that data.
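As a sketch of the hash-vs-array tradeoff for sparse indices (toy numbers here, not the 500e6 of the original post; the variable names are made up for illustration):

```perl
use strict;
use warnings;

# A hash stores only the occupied slots, so for sparse 32-bit keys
# it avoids paying for the billions of empty positions an array
# indexed by the raw key would span.
my %d;
my @keys = ( 7, 1_000_003, 4_000_000_000 );    # hypothetical sparse indices
$d{$_}++ for @keys;

# The hash holds exactly as many entries as there are distinct keys.
printf "hash entries: %d\n", scalar keys %d;    # 3
```

The point is that hash memory scales with the number of distinct keys actually seen, not with the size of the key space.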
Update: "... which don't occur more than 256 times each"; your "test" doesn't reflect that.
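Side note, not from the original post: if each count really stays below 256, every count fits in a single byte, so a flat byte string via vec() is another compact option. A minimal sketch with hypothetical indices:

```perl
use strict;
use warnings;

# Counts never exceed 255, so each fits in 8 bits.
# vec() treats a scalar as a flat vector: here, byte $i of $counts.
my $counts = '';
my @seen = ( 3, 3, 10, 3 );          # hypothetical stream of indices
vec( $counts, $_, 8 )++ for @seen;   # bump the byte for each index

printf "count[3] = %d, bytes used = %d\n",
    vec( $counts, 3, 8 ), length $counts;
```

This costs one byte per slot up to the highest index seen, so it only pays off when the index space is reasonably dense; for very sparse keys the hash above stays cheaper.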
Cheers Rolf

In reply to Re: Memory efficient way to deal with really large arrays? by LanX