PerlMonks  

Re: Database searches with persistent results via CGI

by dsheroh (Monsignor)
on May 09, 2002 at 19:40 UTC [id://165476]


in reply to Database searches with persistent results via CGI

I can think of two options, offhand:

1) Since you're already going to be storing some session state (mapping the cookie to the temp table's name, if nothing else), just add the last search terms to the retained state.

2) Instead of saving the search terms with the session info, you could also store them with the temp table (maybe even using the search terms as part of the table name, with appropriate precautions).

#1 is more straightforward; #2 allows you to reuse the search results for anyone else who does that same search (such as all the monks looking for SMITH JOHN right now). Either way, though, you probably want to maintain this on the server side instead of relying on the client to tell you which set of saved search results to use, since the client can lie to you quite easily.
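Option #1 might look something like this: a minimal in-memory sketch of keeping the last search terms in server-side session state keyed by the cookie's session id. The names (`save_search`, `last_search`, the `results_*` table name) are illustrative; in a real CGI app the store would live in a database or a module like Apache::Session, not a Perl hash.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical server-side session store, keyed by the session id
# taken from the client's cookie.  A plain hash stands in for the
# real persistent store.
my %session;

sub save_search {
    my ($sid, $terms, $temp_table) = @_;
    $session{$sid} = { terms => $terms, temp_table => $temp_table };
}

sub last_search {
    my ($sid) = @_;
    return $session{$sid};    # undef if this session has no saved search
}

save_search('abc123', 'SMITH JOHN', 'results_abc123');
my $saved = last_search('abc123');
print "$saved->{terms} -> $saved->{temp_table}\n";   # SMITH JOHN -> results_abc123
```

Because the mapping lives on the server, the client only ever presents its session id; it cannot point you at someone else's saved results.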


Replies are listed 'Best First'.
Re: Re: Database searches with persistent results via CGI
by Anonymous Monk on May 10, 2002 at 13:50 UTC
    I don't see any reason to restrict your cache to a specific user. Isn't it likely that on a day when one person searches for "Yasser Arafat", someone else will too? Maybe not, but if this is a feature of your application, how about this solution? Every search for a name can be recorded. The first step when performing a new search is to check whether the cache helps. Periodically, a cron job removes the least recently used results. This will work better if what you cache is not the result of a complex query - which is less likely to be reusable - but the result of an atomic search, with a single match criterion. It will be like having a partial index into your database.
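A minimal sketch of that idea: a cache keyed by the search term itself (not the user), with an eviction pass playing the cron job's role. The counter, limit, and sub names are all illustrative assumptions; a real version would keep the cache in tables, as the reply below describes.

```perl
use strict;
use warnings;

# term => { results => ..., used => tick }
my %cache;
my $clock       = 0;   # monotonic counter standing in for "last used" time
my $max_entries = 2;   # tiny limit so the eviction is visible
my $searches    = 0;   # how many real searches we actually ran

sub cached_search {
    my ($term, $run_search) = @_;
    if (my $hit = $cache{$term}) {        # cache hit: refresh the LRU stamp
        $hit->{used} = ++$clock;
        return $hit->{results};
    }
    my $results = $run_search->($term);   # miss: run the real search
    $cache{$term} = { results => $results, used => ++$clock };
    return $results;
}

sub evict_lru {                           # the periodic cron job's role
    while (keys %cache > $max_entries) {
        my ($oldest) = sort { $cache{$a}{used} <=> $cache{$b}{used} } keys %cache;
        delete $cache{$oldest};
    }
}

my $fake_search = sub { $searches++; [ "rows for $_[0]" ] };
cached_search('Yasser Arafat', $fake_search);   # miss: search runs
cached_search('Yasser Arafat', $fake_search);   # hit: no second search
cached_search('SMITH JOHN',    $fake_search);   # miss
cached_search('DOE JANE',      $fake_search);   # miss
evict_lru();                                    # 'Yasser Arafat' is now LRU, dropped
print "real searches run: $searches\n";         # real searches run: 3
```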
      This of course is great under certain circumstances. A master table storing the most recent searches, each with a last-searched timestamp and replaced in LRU fashion, is a good solution for this. Each of the records in this table would then point to a results table that would be used as a cache. This would speed things up wonderfully for the best case.

      However, if there is a high rate of insertion into the database, there needs to be a reasonably short timeout on the cache so that a search includes all or nearly all of the matches a fresh search would generate. There are ways around this, too, of course.

      If one keeps a table of recent insertions, such as the last 50, 500, 5000, or whatever number makes sense, then one can add those to the search easily enough if they match. One could also be more picky and redo the search if the search was done before the oldest of the rows in one's recent insertions table. This shouldn't require any changes to existing tables, and that's a Good Thing(tm).
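The recent-insertions idea above can be sketched as follows, with plain Perl arrays standing in for the real tables. The cap, record shape, and sub names are assumptions for illustration only.

```perl
use strict;
use warnings;

my @recent;     # most recent insertions, newest last
my $keep = 3;   # illustrative cap ("the last 50, 500, 5000, or whatever")

sub record_insert {
    my ($record) = @_;
    push @recent, $record;
    shift @recent while @recent > $keep;   # drop the oldest beyond the cap
}

# Return the cached rows plus any recent insertions that match, so the
# cache can be slightly stale without missing newly inserted records.
sub search_with_recent {
    my ($cached_rows, $matches) = @_;
    my @extra = grep { $matches->($_) } @recent;
    return [ @$cached_rows, @extra ];
}

record_insert($_) for qw(SMITH ADAMS SMYTHE JONES);   # SMITH falls off the cap
my $rows = search_with_recent([ 'SMITH JOHN' ], sub { $_[0] =~ /^SM/ });
print join(", ", @$rows), "\n";   # SMITH JOHN, SMYTHE
```

As the post notes, if the cached search predates the oldest row still in the recent-insertions list, the safe move is to redo the full search instead of merging.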

      On a similar note, if one stores, alongside one's cached results table or the table listing search strings, the range over which the last search was made, and can then prepare a new search covering only the additional rows, that is a major Win(sm). This also shouldn't require changes to existing tables.
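That incremental-range idea can be sketched with a high-water mark: remember the highest row id the cached search covered, and on the next request scan only rows above it, appending any matches. The row layout and sub name are illustrative assumptions.

```perl
use strict;
use warnings;

# Rows are [ $id, $name ]; ids increase with insertion order, so the
# highest id already searched acts as a watermark for the cache.
my @rows   = ( [1, 'SMITH JOHN'], [2, 'DOE JANE'] );
my %cached = ( last_id => 0, results => [] );

sub refresh_search {
    my ($matches) = @_;
    # Scan only rows inserted since the last search.
    my @new = grep { $_->[0] > $cached{last_id} && $matches->($_->[1]) } @rows;
    push @{ $cached{results} }, map { $_->[1] } @new;
    $cached{last_id} = $rows[-1][0] if @rows;   # advance the watermark
    return $cached{results};
}

my $want_smith = sub { $_[0] =~ /SMITH/ };
refresh_search($want_smith);                  # first pass scans ids 1..2
push @rows, [3, 'SMITH JANE'];                # a new insertion arrives
my $results = refresh_search($want_smith);    # second pass scans only id 3
print join(", ", @$results), "\n";            # SMITH JOHN, SMITH JANE
```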

      Christopher E. Stith
      Do not try to debug the program, for that is impossible. Instead, try only to realize the truth. There is no bug. Then you will find that it is not the program that bends, but only the list of features.
