Perl does trade off memory for speed in quite a few places, so if you are looking for super memory efficiency, Perl is probably not the best choice. That being said:
- It depends on what you are doing with the tied hashes. You may want to look at using a full relational database like MySQL or PostgreSQL (a minimal DBI sketch follows this list)
- No more memory than running the same script once a day, or once a year. Once the script finishes running, it releases all the memory it used back to the OS (this assumes you are using something like cron to call the script every 5 minutes, rather than leaving it running constantly)
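For the database route, here's a minimal sketch using DBI with a MySQL driver; the connection details, table, and column names are all hypothetical:

```perl
use strict;
use warnings;
use DBI;

# Hypothetical connection parameters; substitute your own
# database name, host, user, and password.
my $dbh = DBI->connect(
    'dbi:mysql:database=mydb;host=localhost',
    'user', 'password',
    { RaiseError => 1, AutoCommit => 1 },
);

# Fetch a single value on demand rather than holding the whole
# data set in an in-memory (or tied) hash.
my ($value) = $dbh->selectrow_array(
    'SELECT value FROM kv WHERE name = ?', undef, 'some_key',
);

$dbh->disconnect;
```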
I haven't benchmarked it, but it's probably much faster to do `$cnt = keys %hash;` than to count the keys in a loop, since `keys` in scalar context just returns the hash's internal entry count.
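If you want to check that yourself, here's a minimal sketch using the core Benchmark module (the hash size and labels are arbitrary):

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

my %hash = map { $_ => 1 } 1 .. 100_000;

# Run each sub for at least 2 CPU seconds and compare rates.
cmpthese( -2, {
    # keys() in scalar context returns the hash's internal
    # entry count without building a list of keys.
    scalar_keys => sub { my $cnt = keys %hash },
    # Walking the hash with each() visits every entry.
    each_loop   => sub {
        my $cnt = 0;
        while ( my ($k, $v) = each %hash ) { $cnt++ }
    },
} );
```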
- Common memory wasters:
  - Reading an entire file into memory instead of processing it a line at a time (a sketch follows this list)
  - Large data structures stored entirely in memory instead of on disk with a tied hash or array (see the DB_File sketch below)
  - Using large lists in a foreach, e.g. `foreach (1..10000) { }`; this relates to the point above (see the loop sketch below)
- There are probably more, but these three come to mind right off the bat, the first being the most common
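For the first waster, a sketch of line-at-a-time processing (the filename is hypothetical):

```perl
use strict;
use warnings;

open my $fh, '<', 'big_file.log' or die "Cannot open big_file.log: $!";

# Only one line is held in memory at a time, unlike
# slurping the whole file with: my @lines = <$fh>;
while ( my $line = <$fh> ) {
    chomp $line;
    # ... process $line here ...
}

close $fh;
```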
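For the second, a sketch tying a hash to an on-disk DBM file with DB_File (this assumes Berkeley DB is available; the filename is hypothetical). Entries live in the file rather than in RAM:

```perl
use strict;
use warnings;
use Fcntl;      # for O_CREAT and O_RDWR
use DB_File;

# Ties %data to data.db on disk; lookups and stores go through the file.
tie my %data, 'DB_File', 'data.db', O_CREAT | O_RDWR, 0644, $DB_HASH
    or die "Cannot tie data.db: $!";

$data{hits}++;                      # read-modify-write hits the file
print "hits = $data{hits}\n";

untie %data;
```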
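For the third, a C-style loop sidesteps building the list up front. (Worth noting: recent perls already optimize a literal `foreach (1..10000)` into a counting loop, so this matters most when the list comes from somewhere else, such as a function returning a large list.)

```perl
use strict;
use warnings;

# Counts without constructing a 10,000-element list first.
for ( my $i = 1; $i <= 10_000; $i++ ) {
    # ... work with $i here ...
}
```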
You are always (or almost always) going to trade off memory for speed. You just have to decide which is more important.
Update: Benchmarked the key-counting part I hadn't benchmarked yet.
Update2: Added more memory wasters.