PerlMonks
Re: Re: Database searches with persistent results via CGI

by Anonymous Monk
on May 10, 2002 at 13:50 UTC ( [id://165635] )


in reply to Re: Database searches with persistent results via CGI
in thread Database searches with persistent results via CGI

I don't see any reason to restrict your cache to a specific user. Isn't it likely that on a day when one person searches for "Yasser Arafat" someone else will too? Maybe not, but if repeated searches are a feature of your application, how about this solution? Every search for a name is recorded. The first step when performing a new search is to check whether the cache can answer it. Periodically, a cron job removes the least recently used results. This works better if what you cache is not the result of a complex query, which is less likely to be reusable, but the result of an atomic search with a single match criterion. It will be like having a partial index into your database.
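A minimal sketch of that idea, using SQLite for brevity. All of the table and function names here (`searches`, `cached`, `evict_lru`, and so on) are assumptions for illustration, not anything from the thread:

```python
import sqlite3
import time

# Sketch of the shared search cache described above. Schema and names
# are assumptions, not from the original post.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE people   (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE searches (term TEXT PRIMARY KEY, last_used REAL);
    CREATE TABLE cached   (term TEXT, person_id INTEGER);
""")
db.executemany("INSERT INTO people (name) VALUES (?)",
               [("Yasser Arafat",), ("Yasser Something",), ("Someone Else",)])

def search(term):
    """Check the cache first; on a miss, run the atomic query and cache it."""
    now = time.time()
    hit = db.execute("SELECT 1 FROM searches WHERE term = ?", (term,)).fetchone()
    if hit:
        # Cache hit: just refresh the LRU timestamp.
        db.execute("UPDATE searches SET last_used = ? WHERE term = ?", (now, term))
    else:
        rows = db.execute("SELECT id FROM people WHERE name LIKE ?",
                          ("%" + term + "%",)).fetchall()
        db.execute("INSERT INTO searches VALUES (?, ?)", (term, now))
        db.executemany("INSERT INTO cached VALUES (?, ?)",
                       [(term, r[0]) for r in rows])
    return [r[0] for r in db.execute(
        "SELECT p.name FROM cached c JOIN people p ON p.id = c.person_id "
        "WHERE c.term = ?", (term,))]

def evict_lru(keep):
    """What the periodic cron job would do: drop all but the newest entries."""
    db.execute("""DELETE FROM searches WHERE term NOT IN
                  (SELECT term FROM searches ORDER BY last_used DESC LIMIT ?)""",
               (keep,))
    db.execute("DELETE FROM cached WHERE term NOT IN (SELECT term FROM searches)")
```

Because the cached unit is a single match criterion rather than a whole complex query, a second user's search for the same name reuses the first user's work directly.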

Replies are listed 'Best First'.
Re: Re: Re: Database searches with persistent results via CGI
by mr_mischief (Monsignor) on May 10, 2002 at 18:26 UTC
    This of course is great under certain circumstances. One master table storing the most recent searches, each with a last-searched time and replaced in LRU fashion, is a good solution for this. Each record in this table would then point to a results table used as a cache. This would speed things up wonderfully in the best case.

    However, if there is a high rate of insertion into the database, there needs to be a reasonably short timeout on the cache so that a search includes all or nearly all of the matches a fresh search would generate. There are ways around this, too, of course.

    If one keeps a table of recent insertions, such as the last 50, 500, 5000, or whatever number makes sense, then one can add those to the cached results easily enough if they match. One could also be more picky and redo the full search if the cached search predates the oldest row in one's recent-insertions table. This shouldn't require any changes to existing tables, and that's a Good Thing(tm).
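That logic can be sketched in a few lines. This is a hypothetical in-memory illustration: every name here (`record_insert`, `search_with_cache`, `full_search`) is an assumption, and a logical tick counter stands in for real timestamps so the comparisons are deterministic:

```python
from itertools import count

# Logical clock standing in for insert/search timestamps.
_clock = count(1)
def now():
    return next(_clock)

RECENT_LIMIT = 500              # keep only the last N inserted rows on the side
recent_inserts = []             # (tick, row) pairs, newest last

def record_insert(row):
    recent_inserts.append((now(), row))
    del recent_inserts[:-RECENT_LIMIT]

def search_with_cache(term, cache, full_search):
    """cache maps term -> (cached_at, rows); full_search(term) hits the DB."""
    entry = cache.get(term)
    oldest_recent = recent_inserts[0][0] if recent_inserts else now()
    if entry is None or entry[0] < oldest_recent:
        # Cache entry is missing or predates our window of recent inserts,
        # so a cheap top-up can't be trusted: redo the real search.
        rows = full_search(term)
        cache[term] = (now(), rows)
        return rows
    cached_at, rows = entry
    # Cheap top-up: append matching rows inserted since the cache was written.
    extra = [r for (t, r) in recent_inserts if t > cached_at and term in r]
    return rows + extra
```

The point of the `oldest_recent` check is exactly the "be more picky" clause above: the side table only covers a window, so a cache entry older than that window has to be recomputed.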

    On a similar note, if one stores, alongside the cached results table or the table of search strings, the range of rows over which the last search was made, and can prepare a new search that covers only the rows added since, then that is a major Win(sm). This also shouldn't require changes to existing tables.
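Here is one way that incremental scheme might look, again as a hedged SQLite sketch with assumed table names (`cache` with a `max_id` column recording the range each search covered):

```python
import sqlite3

# Sketch of the incremental idea: remember the highest row id each cached
# search covered, then scan only rows added since. Names are assumptions.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE people (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE cache  (term TEXT PRIMARY KEY, max_id INTEGER, names TEXT);
""")

def search(term):
    row = db.execute("SELECT max_id, names FROM cache WHERE term = ?",
                     (term,)).fetchone()
    seen_to = row[0] if row else 0
    names = row[1].split("\n") if row and row[1] else []
    # Only scan rows inserted after the range the cache already covers.
    new = [r[0] for r in db.execute(
        "SELECT name FROM people WHERE id > ? AND name LIKE ?",
        (seen_to, "%" + term + "%"))]
    top = db.execute("SELECT MAX(id) FROM people").fetchone()[0] or 0
    names += new
    db.execute("INSERT OR REPLACE INTO cache VALUES (?, ?, ?)",
               (term, top, "\n".join(names)))
    return names
```

Each repeated search pays only for the rows inserted since the previous one, so the cache never goes stale the way a pure timeout-based cache can.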

    Christopher E. Stith
    Do not try to debug the program, for that is impossible. Instead, try only to realize the truth. There is no bug. Then you will find that it is not the program that bends, but only the list of features.
