in reply to List overhead

Hashes, by their very nature, are often full of empty pointers, each of which is bound to be at least 4 bytes on 32-bit platforms. If your elements are only 30 bytes each, then a large amount of this memory is probably sitting in unused hash-slots. There's nothing much you can do about this, but if you are running short on RAM, you can use a "tied" hash, as described below.

Even with lists, the Perl interpreter must store information besides the data itself, such as the length of the scalar, reference counts, and so forth, which normally doesn't add up to a whole lot. I would expect a minimum of 12-16 bytes to be allocated per scalar just for this kind of internal information. Of course, with several million tiny scalars being allocated, this can add up to a lot of data.
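If you want to see the overhead for yourself, Devel::Size from CPAN (my choice here, not something your program otherwise needs) will report how many bytes Perl really allocates for a scalar or a whole hash. A quick sketch:

    use strict;
    use warnings;
    use Devel::Size qw(size total_size);   # CPAN module, just for measuring

    my $str = 'x' x 30;                    # 30 bytes of "real" data
    printf "one 30-byte scalar: %d bytes\n", size($str);

    my %h = map { sprintf('key%06d', $_) => 'x' x 30 } 1 .. 100_000;
    printf "100_000 of them in a hash: %d bytes\n", total_size(\%h);

The difference between the raw data size and what Devel::Size reports is exactly the bookkeeping we're talking about.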

In compiled languages like C, when you ask for an array of 1,000,000 30-byte strings, that's what you get, usually as a big contiguous chunk of RAM. With Perl, unless you want to do something really crazy, like pack your strings into one giant scalar and use substr to extract them, you have to live with the overhead. Usually it's not so bad, but it might come as a bit of a surprise.
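In case you're curious, the "giant scalar" trick looks roughly like this. This is only a sketch, assuming fixed-width 30-byte records (variable-length strings would also need an offset table), and the helper names are made up for illustration:

    use strict;
    use warnings;

    my $width = 30;
    my $blob  = '';

    sub store_rec {
        my ($i, $val) = @_;
        # grow the blob if we're writing past its current end
        my $need = ($i + 1) * $width;
        $blob .= "\0" x ($need - length $blob) if length($blob) < $need;
        # pack "A30" pads with spaces (or truncates) to exactly $width bytes
        substr($blob, $i * $width, $width) = pack("A$width", $val);
    }

    sub fetch_rec {
        my ($i) = @_;
        my $rec = substr($blob, $i * $width, $width);
        $rec =~ s/[ \0]+$//;    # strip the padding again
        return $rec;
    }

    store_rec(0, 'first record');
    store_rec(1, 'second record');
    print fetch_rec(1), "\n";   # prints "second record"

You pay for this with uglier code and fixed-width records, which is why I call it crazy unless you're truly desperate for RAM.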

Since you have about 30 MB of real data, you are experiencing roughly a threefold expansion once it's loaded into RAM. If this is a problem, you can use a "tied" hash, which uses far less RAM but is disk-based and a fair bit slower. Still, a tied hash behaves just like a regular hash from your code's point of view: it only takes a few extra lines to tie it, and the rest of your code can stay the same.
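For example, using DB_File (my pick here; any DBM module such as SDBM_File or GDBM_File is tied the same way), those few extra lines look something like this:

    use strict;
    use warnings;
    use DB_File;
    use Fcntl;

    my %data;
    tie %data, 'DB_File', 'data.db', O_RDWR|O_CREAT, 0644, $DB_HASH
        or die "Cannot tie data.db: $!";

    # From here on, %data works like an ordinary hash, but lives on disk.
    $data{some_key} = 'some 30-byte value goes here';
    print $data{some_key}, "\n";

    untie %data;

Everything after the tie line is the same hash code you already have; only the tie and untie are new.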