PerlMonks
Re^3: Get unique fields from file
by LanX (Saint) on Jan 08, 2022 at 15:27 UTC [id://11140269]
> Depending upon the data of course, your HoH (hash of hash) structure could consume quite a bit more memory than the actual file size in MB.

This shouldn't be a problem if you apply a sliding window technique° plus split the hashes into easily swappable chunks². The trick is to balance time, space and disk access by minimizing the number of swaps. This will scale well, up to the limit given by disk space.
Cheers Rolf

°) see
²) see
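A minimal sketch of the chunking idea (my own illustration, not necessarily what LanX had in mind): partition the key space into N chunks and make one pass over the data per chunk, so only one sub-hash lives in memory at a time; each per-chunk hash could equally be swapped to disk (e.g. with Storable) instead of being merged immediately. The data, chunk count and field layout here are all made up for demonstration.

```perl
use strict;
use warnings;

# Stand-in for the file: "key,field" records.
my @lines = map { "user$_,field" . ( $_ % 3 ) } 1 .. 12;

my $chunks = 4;      # number of swappable sub-hashes (tuning knob)
my %unique;          # key => count of unique fields seen

for my $c ( 0 .. $chunks - 1 ) {
    my %h;           # per-chunk HoH, freed (or swapped out) after each pass
    for my $line (@lines) {
        my ( $key, $field ) = split /,/, $line;

        # Cheap checksum assigns each key to exactly one chunk.
        next unless ( unpack( '%32C*', $key ) % $chunks ) == $c;
        $h{$key}{$field}++;
    }

    # Here %h could be serialized to disk instead of merged in memory.
    $unique{$_} = scalar keys %{ $h{$_} } for keys %h;
}

print "$_ => $unique{$_}\n" for sort keys %unique;
```

Because each pass only touches keys belonging to the current chunk, peak memory is roughly the full hash size divided by the chunk count, traded against one extra read of the input per chunk.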
In Section: Seekers of Perl Wisdom