EchoAngel has asked for the wisdom of the Perl Monks concerning the following question:

Hi Monks, I am new to CGI. I have a Perl package that generates a large amount of information. I was thinking of building two CGI scripts: the first would create the hash and dump it to a file, and the second would read the hash back in and show whatever the user wants to view, so the user only ever hits the second script. Do you think this would help? I want this to run as fast as possible, and I suspect that a single CGI script which both generates the hash and builds the HTML would take too long. Note: the generated hash is HUGE, a couple thousand lines. Or do you think I shouldn't use Perl/CGI at all?
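
Roughly what I have in mind, as an untested sketch (the file name and the data-building step are made up, and I've used Storable rather than Data::Dumper for the dump):

    #!/usr/bin/perl
    # build_data.pl -- first script: build the hash and dump it to disk.
    # Storable's store/retrieve should be faster than eval-ing Data::Dumper output.
    use strict;
    use warnings;
    use Storable qw(store);

    # Stand-in for the real package call that builds the data.
    my %data = ( example_key => 'example value' );

    store( \%data, '/tmp/mydata.sto' ) or die "store failed: $!";

and the viewer:

    #!/usr/bin/perl
    # view.cgi -- second script: read the hash back and show only the slice asked for.
    use strict;
    use warnings;
    use CGI qw(:standard);
    use Storable qw(retrieve);

    my $data = retrieve('/tmp/mydata.sto') or die "retrieve failed: $!";
    my $key  = param('key') || '';

    my $html = exists $data->{$key}
        ? "<p>$key = $data->{$key}</p>"
        : "<p>no entry for '$key'</p>";
    print header('text/html'), $html;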

Replies are listed 'Best First'.
Re: Perl/Cgi : Suggestions on how to pass Large Hash Data to myscript.cgi?
by Joost (Canon) on Dec 24, 2004 at 20:57 UTC
    If your data needs to be constructed fresh for every request, it doesn't make sense to store it in a file and then read it back - in fact, it would only make the whole system slower. If OTOH it's only created every 5 minutes or so, I would suggest running a script via cron to create the data.

    Depending on the type and size of the data in your hash, you might be better off using a DBM module or a relational database for faster lookups (Data::Dumper requires you to read the entire structure into memory before you can use it or filter out the results you want to see).
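
    For example (untested, and assuming DB_File/Berkeley DB is available -- SDBM_File ships with Perl if not), tying the hash to a DBM file lets a request fetch a single key without slurping the whole structure:

        # Untested sketch: DB_File ties a hash to an on-disk file, so each
        # lookup touches only the requested key instead of the whole dump.
        use strict;
        use warnings;
        use DB_File;
        use Fcntl qw(O_CREAT O_RDWR);

        tie my %data, 'DB_File', '/tmp/mydata.db', O_CREAT | O_RDWR, 0644, $DB_HASH
            or die "cannot tie /tmp/mydata.db: $!";

        $data{some_key} = 'some value';   # writer side
        print $data{some_key}, "\n";      # reader side: fetches just this key
        untie %data;

    Keep in mind that plain DBM values have to be flat strings; for nested structures you would layer MLDBM (or Storable::freeze the values) on top.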

    By the way, a "couple of thousand" hash keys might not be a problem at all, depending on how fast you can generate the data, how many requests you get, how much memory the server has, and its CPU speed. If you have some time to experiment, try the "dumb but easy" approach first (i.e. just generate the hash and filter out the results, no files or databases etc.), and see if it works before you start throwing all kinds of optimization techniques at it.
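
    The dumb-but-easy version could be as small as this (untested; build_big_hash() is just a stand-in for whatever your package does):

        #!/usr/bin/perl
        # Untested: rebuild everything on each request, print only the slice asked for.
        use strict;
        use warnings;
        use CGI qw(:standard);

        my %data = build_big_hash();     # stand-in for the real package call
        my $key  = param('key') || '';

        print header('text/html');
        print exists $data{$key} ? "<p>$data{$key}</p>" : "<p>nothing for '$key'</p>";

        sub build_big_hash { ( example_key => 'example value' ) }   # dummy data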

    You might also be able to just generate the requested data, instead of all of it - but that really depends on the algorithm used to generate it.

Re: Perl/Cgi : Suggestions on how to pass Large Hash Data to myscript.cgi?
by Jaap (Curate) on Dec 24, 2004 at 20:33 UTC
    Any form of caching should speed your program up.
    I don't know whether freezing/dumping the hash to a file is the best option, but it is very simple and will probably work, so why not. You could also use mod_perl (calculate the data at startup and have requests read it from memory), make smart use of a database (save your results to it), or save the data to an XML file.
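
    The mod_perl idea, roughly (untested; under Apache::Registry the interpreter and its package variables survive between requests in the same child, so the hash is built once per child and then reused):

        # Untested sketch of a Registry script: %CACHE lives in the package
        # namespace, so it persists across requests handled by this child.
        use strict;
        use warnings;
        use CGI qw(:standard);

        our %CACHE;
        %CACHE = build_big_hash() unless %CACHE;   # stand-in for the real builder

        my $key = param('key') || '';
        print header('text/html');
        print exists $CACHE{$key} ? "<p>$CACHE{$key}</p>" : "<p>no such key</p>";

        sub build_big_hash { ( example_key => 'example value' ) }   # dummy data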