in reply to Slurping BIG files into Hashes

This does seem a little slow, but it might be down to your hardware.

I perform a similar operation, but on a properly big file: 27 million records, with 8-character keys and values.
That takes only about 20 minutes, using exactly the same method as you - substr()'ing the keys and values out of a flat file and stuffing them into a hash (see the sketch below).
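A minimal sketch of that approach, assuming a fixed-width flat file where each line starts with an 8-character key followed immediately by an 8-character value (the filename "records.dat" is made up for illustration):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my %lookup;

    open my $fh, '<', 'records.dat' or die "Can't open records.dat: $!";
    while (my $line = <$fh>) {
        my $key   = substr($line, 0, 8);   # first 8 characters
        my $value = substr($line, 8, 8);   # next 8 characters
        $lookup{$key} = $value;
    }
    close $fh;

    print scalar(keys %lookup), " records loaded\n";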

I can't think of a better way of doing it - unpack is unlikely to be much faster, but you might want to do some benchmarking (a rough comparison is sketched below).
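Here is one way to compare the two with the core Benchmark module, again assuming the same 8+8 fixed-width layout; the record string is just synthetic test data:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Benchmark qw(cmpthese);

    my $record = 'AAAABBBB11112222';

    # Run each sub for at least 3 CPU seconds and print a comparison table.
    cmpthese(-3, {
        substr => sub {
            my $key   = substr($record, 0, 8);
            my $value = substr($record, 8, 8);
        },
        unpack => sub {
            my ($key, $value) = unpack 'A8 A8', $record;
        },
    });

On most systems the difference between the two is small compared with the cost of reading 27 million lines in the first place, which is why I doubt it will buy you much.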


If the information in this post is inaccurate, or just plain wrong, don't just downvote - please post explaining what's wrong.
That way everyone learns.