In terms of the size of a structure in Perl, there are a few separate considerations.
The hash buckets themselves; these grow automatically to keep the collision rate low, but as far as I know they never shrink, even when pairs are deleted. On each growth, all the keys are rehashed internally, much like pouring all the water from a small bucket into a larger one. You might try to reverse the process by occasionally pouring a half-empty bucket back into a smaller one: it might be as simple as doing %hash = (%hash) if (eval scalar %hash) < 0.1; but I haven't tested this.
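If you want to experiment with that idea, here is a rough, equally untested sketch. It assumes a perl new enough to have Hash::Util::bucket_ratio (5.26 and later, where scalar %hash reports the key count rather than bucket usage); on older perls you would parse scalar %hash directly. The maybe_shrink name is mine:

    use strict;
    use warnings;
    use Hash::Util qw(bucket_ratio);   # perl 5.26+; older perls get the same
                                       # "used/total" string from scalar(%hash)

    # Rebuild a hash into fresh storage when fewer than 10% of its buckets
    # are in use after heavy deletion. Whether the copy really releases the
    # old bucket array is exactly the part I haven't tested.
    sub maybe_shrink {
        my ($href) = @_;
        my ($used, $total) = bucket_ratio(%$href) =~ m{^(\d+)/(\d+)$}
            or return;                 # empty hash: nothing to do
        %$href = %$href if $total && $used / $total < 0.1;
        return;
    }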
The elements themselves. Is your hash storing references to large structures? Are those really reaped when you remove them from the hash, or is there some other variable you've forgotten about that is also keeping those references alive?
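One quick way to check the second point is to give the stored objects a destructor that announces when they are actually reaped. The BigThing class here is made up purely for illustration:

    use strict;
    use warnings;

    package BigThing;
    sub new     { my $class = shift; return bless { payload => 'x' x 1_000_000 }, $class }
    sub DESTROY { print "BigThing reaped\n" }

    package main;

    my %cache;
    $cache{a} = BigThing->new;
    my $extra = $cache{a};   # a second variable holding the same reference

    delete $cache{a};        # prints nothing: $extra still keeps the object alive
    print "deleted from hash\n";

    undef $extra;            # last reference gone, so "BigThing reaped" prints here

If delete never triggers the destructor, something else still holds a reference; for circular structures, a module like Devel::Cycle can help track down what.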
Perl never hands memory back to the operating system. That's not Perl's fault; it's the way memory works in most modern operating systems. When Perl reaps an object, the memory is free for Perl to use again, but it isn't free for Excel or XFree to use until the Perl process exits.
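If you want to see this for yourself on Linux, you can watch the process's resident set size before and after dropping a large structure; the exact numbers depend on your perl and your malloc, but the size rarely comes back down:

    use strict;
    use warnings;

    # Linux-only demonstration: report resident set size before and after
    # releasing a large hash. The freed memory goes back to perl's allocator,
    # so the number after the undef usually stays close to the peak.
    sub rss_kb {
        open my $fh, '<', '/proc/self/status' or die "no /proc/self/status: $!";
        while (<$fh>) { return $1 if /^VmRSS:\s+(\d+)\s+kB/ }
        return 0;
    }

    printf "at start:    %d kB\n", rss_kb();
    my %big = map { $_ => "value $_" } 1 .. 500_000;
    printf "after build: %d kB\n", rss_kb();
    undef %big;   # every pair is reaped, but the pages stay with this process
    printf "after free:  %d kB\n", rss_kb();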
A 32-bit machine has hard limits on how much memory a single process can address: 4GB is the numerical ceiling, but in practice it's cut to somewhere between 2GB and 3.5GB, depending on kernel and operating system overhead and design. If you need more than that, you'll have to move to a 64-bit OS, or get creative with persistent disk-backed storage and automatic restoration of the entries that are actually active.
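One way to get creative is to tie the hash to a file on disk so the pairs live outside the process's address space. This sketch assumes the DB_File module (and its Berkeley DB library) is available; any DBM-style module ties the same way, and cache.db is just an example file name:

    use strict;
    use warnings;
    use Fcntl;      # for O_RDWR and O_CREAT
    use DB_File;    # or SDBM_File, GDBM_File, ... whatever your perl has

    # Keep the pairs on disk instead of in the process's limited address space.
    my %cache;
    tie %cache, 'DB_File', 'cache.db', O_RDWR | O_CREAT, 0644, $DB_HASH
        or die "Cannot tie cache.db: $!";

    $cache{"entry:$_"} = "payload $_" for 1 .. 1_000;   # written through to disk
    print $cache{'entry:42'}, "\n";

    untie %cache;   # flush and close the file

Note that a plain DBM tie stores flat strings only; to keep references you'd layer something like MLDBM on top, or reach for a module such as DBM::Deep.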
Most large-data algorithms can be rewritten as stream-data algorithms instead. Do you really need to keep all those entries around for so long? Sometimes you do, but often you don't. Rethink the core goals of your algorithm and see if there's another way.
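For example, if all you ultimately need is a per-key total, keep only the running totals and throw each record away as soon as you've read it. The events.log file and its tab-separated user/bytes format below are invented for the sake of the example:

    use strict;
    use warnings;

    # Stream the input: one small hash of totals, never the full data set.
    my %bytes_per_user;    # one entry per user, not one per input line

    open my $fh, '<', 'events.log' or die "Cannot open events.log: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my ($user, $bytes) = split /\t/, $line;
        next unless defined $bytes;
        $bytes_per_user{$user} += $bytes;   # the raw line is discarded immediately
    }
    close $fh;

    printf "%s: %d bytes\n", $_, $bytes_per_user{$_} for sort keys %bytes_per_user;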