in reply to Perl/Cgi : Suggestions on how to pass Large Hash Data to myscript.cgi?

If your data needs to be constructed fresh for every request, it doesn't make sense to store it in a file and then read it back - in fact, that would only make the whole system slower. If, on the other hand, it only changes every 5 minutes or so, I would suggest running a script via cron to create the data and having the CGI script just read it back.
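
If you go the cron route, the core Storable module makes the save/load round trip cheap. A minimal sketch (build_data() and the /tmp paths are just placeholders for whatever you actually use):

    #!/usr/bin/perl
    # cron job: rebuild the hash and write it atomically,
    # so the CGI never reads a half-written file
    use strict;
    use warnings;
    use Storable qw(nstore);

    my %data = build_data();
    nstore(\%data, '/tmp/mydata.tmp') or die "nstore failed: $!";
    rename '/tmp/mydata.tmp', '/tmp/mydata.stor' or die "rename failed: $!";

    sub build_data {    # stand-in for your real generation code
        return ( apple => 3, banana => 5 );
    }

The CGI side then just does:

    use Storable qw(retrieve);
    my $data = retrieve('/tmp/mydata.stor');    # get the hashref back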

Depending on the type and size of the data in your hash, you might be better off using a DBM module or a relational database for faster lookups. (Data::Dumper requires you to read the entire structure into memory before you can use it or filter out the results you want to see.)
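
If a DBM file fits your data, here's a minimal sketch with the core SDBM_File module (the /tmp/mydata path is made up, and note SDBM caps each record at about 1K - DB_File or GDBM_File handle larger values):

    use strict;
    use warnings;
    use Fcntl;
    use SDBM_File;

    # writer side (e.g. the cron job): store each key/value pair
    tie my %db, 'SDBM_File', '/tmp/mydata', O_RDWR|O_CREAT, 0644
        or die "tie failed: $!";
    $db{apple} = 3;
    untie %db;

    # reader side (the CGI): one lookup reads one record from disk,
    # never the whole data set
    tie my %lookup, 'SDBM_File', '/tmp/mydata', O_RDONLY, 0644
        or die "tie failed: $!";
    print "apple: $lookup{apple}\n";
    untie %lookup;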

By the way, a "couple of thousand" hash keys might not be a problem at all, depending on how fast you can generate your data, how many requests you get, and the server's memory and CPU speed. If you have some time to experiment, try the "dumb but easy" approach first (i.e. just generate the hash and filter out the results on every request, no files or databases etc.), and see if it's fast enough before you start throwing all kinds of optimization techniques at it.
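
The "dumb but easy" version might look something like this (CGI.pm assumed, build_data() standing in for your generator, and the 'filter' parameter invented for the example):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;

    my $q      = CGI->new;
    my $filter = $q->param('filter') // '';

    my %data = build_data();    # regenerate everything on every request

    print $q->header('text/plain');
    for my $key (sort grep { /\Q$filter\E/ } keys %data) {
        print "$key: $data{$key}\n";
    }

    sub build_data {    # stand-in for your real generation code
        return ( apple => 3, banana => 5, cherry => 7 );
    }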

You might also be able to generate just the requested data, instead of all of it - but that really depends on the algorithm used to generate it.
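
If that turns out to be possible, the script shrinks to something like this (compute_value() is hypothetical, standing in for a per-key computation):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;

    my $q   = CGI->new;
    my $key = $q->param('key') // '';

    # compute only the value that was actually asked for
    print $q->header('text/plain'), "$key: ", compute_value($key), "\n";

    sub compute_value {    # hypothetical per-key computation
        my ($k) = @_;
        return length $k;  # dummy stand-in
    }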
