If you need a bigger answer, you need to paint a bigger picture. There are many ways to speed up searches, and they all depend on what you are searching for, how big the haystack is, what you are going to do when you find each item, and how many items there are. Good techniques also depend on whether the file is sorted, how big the keys are, how much ancillary data there is, whether the file is across a network, on local disk, or in memory, and probably a slew of other things.
The best thing you can do is describe the bigger picture. 20,000 records are not very many, so most likely something in your larger process is being done inefficiently. It's unlikely to be a Perl construct; it's much more likely to be something like needlessly nested loops (quite possibly the grep you mentioned, as sketched below).
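To make the nested-loop point concrete: calling grep on a 20,000-element list inside a loop over another 20,000-element list costs on the order of 400 million comparisons, while indexing one list into a hash first makes each lookup constant-time. A minimal sketch, assuming records keyed by an ID in their first comma-separated field (the record layout and variable names here are made up for illustration, since we haven't seen your actual code):

    use strict;
    use warnings;

    # Hypothetical data standing in for the real file contents.
    my @records = map {"id$_,payload$_"} 1 .. 20_000;
    my @wanted  = map {"id$_"} grep { !( $_ % 7 ) } 1 .. 20_000;

    # Slow pattern: grep inside a loop rescans all of @records for
    # every wanted key, so the total work grows as O(n * m).
    # for my $key (@wanted) {
    #     my ($hit) = grep {/^\Q$key\E,/} @records;
    #     ...
    # }

    # Faster pattern: index the records by key once (one pass), then
    # each subsequent lookup is a constant-time hash access.
    my %record_by_id;
    for my $record (@records) {
        my ($id) = split /,/, $record, 2;
        $record_by_id{$id} = $record;
    }

    for my $key (@wanted) {
        my $hit = $record_by_id{$key};    # O(1) lookup
        next unless defined $hit;
        # ... process $hit ...
    }

The hash-based version does one pass to build the index and one pass to look things up, rather than one pass through the whole file per key, which is usually the difference that matters at this scale.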
We have a sample of your data. Now describe what you need to do with it.