Depending on the type and size of the data in your hash, you might be better off using a DBM module or a relational database for faster lookups (Data::Dumper requires you to read the entire structure into memory before you can use it or filter out the results you want to see).
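For instance, here's a minimal sketch of the DBM approach using DB_File (the filename and data are made up for illustration; SDBM_File, which ships with perl, works the same way):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DB_File;   # or SDBM_File, which is in the core distribution
    use Fcntl;

    # Tie the hash to a DBM file: each lookup goes to the file,
    # so you never read the whole structure into memory.
    tie my %data, 'DB_File', '/tmp/mydata.db', O_RDWR|O_CREAT, 0644
        or die "Cannot tie DBM file: $!";

    $data{foo} = 'bar';          # written straight to disk
    print $data{foo}, "\n";      # read back without loading everything

    untie %data;

Note that plain DBM files only store flat key/value strings; if your values are nested structures you'd want something like MLDBM layered on top.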
By the way, a "couple of thousand" hash keys might not be a problem at all - it depends on how fast you can generate your data, how many requests you get, the amount of memory in the server, and the CPU speed. If you have some time to experiment, try the "dumb but easy" approach first (i.e. just generate the hash and filter out the results - no files, databases, etc.) and see if it works before you start throwing all kinds of optimization techniques at it.
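Something like this, where compute_value is just a stand-in for however you actually generate your data:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;

    my $q      = CGI->new;
    my $filter = $q->param('filter') || '';

    # "Dumb but easy": build the whole hash, then filter it.
    my %data = map { $_ => compute_value($_) } 1 .. 2000;

    print $q->header('text/plain');
    for my $key (sort grep { /\Q$filter\E/ } keys %data) {
        print "$key => $data{$key}\n";
    }

    sub compute_value { my ($k) = @_; return $k ** 2 }  # stand-in generator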
You might also be able to generate just the requested data, instead of all of it - but that really depends on the algorithm used to generate it.
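If each value can be derived from its key alone, the script can skip building the hash entirely, e.g.:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;

    my $q   = CGI->new;
    my $key = $q->param('key') || '';

    print $q->header('text/plain');

    # Compute only what was asked for -- no hash at all.
    if ($key =~ /^\d+$/) {
        print "$key => ", compute_value($key), "\n";
    } else {
        print "no valid key requested\n";
    }

    sub compute_value { my ($k) = @_; return $k ** 2 }  # same stand-in as above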