in reply to Out of memory Error
I've had a similar problem: I was only reading a 2 meg file, yet using over 200 megs of RAM; granted, I was doing much more with the data.
First, let me say that hashes take some extra space compared to arrays, and you should be aware of that going in.
Second, the more modifications you make to your data, the more Perl "meta" data is required. From what I understand (and please correct me if I'm wrong), this metadata speeds up incremental modifications to your hashes and arrays.
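If you want to see the overhead for yourself, the Devel::Size module from CPAN (not core, so it may need installing) can report how much memory a structure actually occupies, including Perl's bookkeeping. A small sketch comparing an array and a hash holding the same keys:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Devel::Size qw(total_size);   # CPAN module, not in the core distribution

# Same 1000 integers stored two ways.
my @array = (1 .. 1000);
my %hash  = map { $_ => 1 } (1 .. 1000);

# total_size() walks the structure and counts all the memory it uses,
# including Perl's internal per-element overhead.
printf "array: %d bytes\n", total_size(\@array);
printf "hash:  %d bytes\n", total_size(\%hash);
```

On my understanding you'll see the hash come out noticeably larger than the array, and both far larger than the raw data itself.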
In short, it is not out of bounds for Perl to use a lot of memory, with big or small datasets, depending on how much you are doing with them.
Perhaps there is a better way of splitting up the problem: for example, writing the data out to two or more files as you read, keyed on the employee, so that you have smaller datasets to work on.
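Here's a minimal sketch of that split-as-you-read idea. The record layout (tab-separated, employee ID in the first field) and the filenames are made up for illustration; adjust them to your actual data. The point is that only one filehandle per employee is held in memory, never the whole dataset:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my %fh;   # one output filehandle per employee ID

# Read the big file line by line; here the __DATA__ section
# stands in for your real input file.
while ( my $line = <DATA> ) {
    my ($employee) = split /\t/, $line;

    # Open an output file the first time we see this employee.
    unless ( $fh{$employee} ) {
        open $fh{$employee}, '>', "employee_$employee.txt"
            or die "can't open employee_$employee.txt: $!";
    }
    print { $fh{$employee} } $line;
}
close $_ for values %fh;

__DATA__
101	2004-01-05	8.0
102	2004-01-05	7.5
101	2004-01-06	8.0
```

Afterward you can process each employee_*.txt on its own, so each pass only ever loads one employee's records.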
--
Kwyjibo. A big, dumb, balding North American ape. With no chin.