Just a quick one regarding your side question: as far as I know there is no limit to the size of a hash; it's simply a question of available RAM.
I've read and de-duplicated a UK MPS file (4.5 million records) with no problem on a machine with 2 GB of RAM.

In reply to Re: Problem cycling through top level of nested hashes by skywalker
in thread Problem cycling through top level of nested hashes by tomdbs98
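For what it's worth, the usual hash-based de-duplication idiom looks something like the sketch below. The records and the whole-record key are illustrative assumptions; in practice you'd read lines from the file and key on whatever field defines a duplicate.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hash keys are unique by definition, so %seen keeps each record
# only the first time it appears; grep preserves the input order.
my @records = ("A123", "B456", "A123", "C789", "B456");

my %seen;
my @unique = grep { !$seen{$_}++ } @records;

print join(", ", @unique), "\n";    # A123, B456, C789
```

Memory use is roughly one hash entry per distinct record, which is why a few million records fit comfortably in 2 GB.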