in reply to Improving efficiency - repeated script hits at small interval


Re: Re: Improving efficiency - repeated script hits at small interval
by shambright (Beadle) on Jun 27, 2002 at 17:39 UTC
    I guess I don't understand your initial question. Why does a user have to have a unique process if the output data for all users "is very similar"? Could you use a cookie to identify a user who will be using this script? Perhaps you could elaborate more on the problem.

    Using mod_perl has some RAMifications (pardon the pun), but the advantage is the large speed increase from not having to compile the script or re-open a database connection for each call. Any data stored in local (not global) variables is freed at the end of the script run, so I don't see it as a big deal.
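    The persistence point can be sketched outside mod_perl itself: under mod_perl, package-scoped data survives between requests, so an expensive resource (such as a DBI handle, usually managed via Apache::DBI) is opened once and reused, while plain lexicals inside the handler are freed every run. The snippet below is a minimal stand-in for that pattern; `get_handle` and the counter are made-up names for illustration, not mod_perl API.

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    our $connect_count = 0;    # counts how many "connections" we open

    {
        # Lexical closed over by get_handle(): it lives for the life of
        # the interpreter, just as package data does under mod_perl.
        my $cached;

        sub get_handle {
            # Open the expensive resource only on the first call;
            # in real code this would be DBI->connect(...).
            $cached //= do { $connect_count++; { dsn => 'dbi:mock' } };
            return $cached;
        }
    }

    # Simulate three incoming requests hitting the same interpreter.
    get_handle() for 1 .. 3;
    print "connections opened: $connect_count\n";
    ```

    Run under a forking CGI model, each hit would pay the connect cost; with a persistent interpreter it is paid once per process.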

    UPDATE: Ok... so you did elaborate, sorry about that.

    If you use a good backend database to keep up with the minute-by-minute changes you are describing, your SQL calls are on the order of thousandths of a second. I would think that is fast enough.

    As far as the actual presentation of data: is there really an advantage to "thawing out" and printing a session versus rebuilding and printing it from scratch? If your backend database is quick, printing out a template with the newer data should work just fine.
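    Rebuilding the page from fresh data on each hit can be as simple as filling placeholders in a template. This is a minimal sketch of the idea, a stand-in for a module like HTML::Template; the placeholder syntax and field names here are invented for illustration.

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    # A tiny template with [% name %]-style placeholders.
    my $template = "<p>User: [% name %] Score: [% score %]</p>";

    # Substitute each placeholder with the matching value from the
    # (presumably just-queried) data hash; unknown fields become ''.
    sub render {
        my ($tmpl, $data) = @_;
        (my $out = $tmpl) =~ s{\[%\s*(\w+)\s*%\]}{ $data->{$1} // '' }ge;
        return $out;
    }

    my $html = render($template, { name => 'alice', score => 42 });
    print "$html\n";
    ```

    If the database query itself is in the millisecond range, this rebuild-and-print path avoids the serialize/thaw cost of a stored session entirely.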

    Honestly, I don't have a ton (umm.. or any) experience with using sessions. So if there is a stronger argument for why using a session is better, please explain why/how. Thanks.