in reply to read a huge mysql Table into memory
But 10 queries per second is not much, and MySQL should easily keep up with that number of requests, assuming the record ID you search on is indexed.
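A minimal sketch of that approach with DBI, looking records up on demand instead of loading the table into memory. The database name, table name, and column name (`mydb`, `my_table`, `record_id`) are assumptions for illustration:

```perl
use strict;
use warnings;
use DBI;

# Connect once; RaiseError turns DBI failures into exceptions.
my $dbh = DBI->connect( 'dbi:mysql:database=mydb', 'user', 'password',
    { RaiseError => 1 } );

# One-time setup: an index on the lookup column is what keeps
# each per-request SELECT cheap.
#   CREATE INDEX idx_record_id ON my_table (record_id);

# Prepare once, execute per request with a placeholder.
my $sth = $dbh->prepare('SELECT * FROM my_table WHERE record_id = ?');

sub fetch_record {
    my ($id) = @_;
    $sth->execute($id);
    return $sth->fetchrow_hashref;    # undef if no matching row
}
```

This is a sketch that needs a running MySQL server and valid credentials; the point is the indexed, parameterised lookup rather than any in-memory copy of the table.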
Another thing to consider: as you are speaking of web-pages, I assume your script will run on the web-server. Is it a CGI-script or does it run under mod_perl? If it is a plain CGI-script, you will have to rebuild your array on every request, and that will take ages. Under mod_perl you pay that start-up cost only once, but I still think a SELECT against your database table will be the best solution.
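To illustrate the mod_perl point: code at module level runs once, when the interpreter first loads the module, and then persists across requests; under plain CGI the same setup would be repeated on every hit. A hedged sketch (package, table, and column names are made up for the example):

```perl
package MyApp::Lookup;
use strict;
use warnings;
use DBI;

# Under mod_perl this connect runs once per interpreter, not once
# per request, because the loaded module persists between hits.
my $dbh = DBI->connect( 'dbi:mysql:database=mydb', 'user', 'password',
    { RaiseError => 1 } );

sub fetch_record {
    my ($id) = @_;

    # prepare_cached returns the already-compiled statement handle
    # on subsequent calls instead of re-preparing it each time.
    my $sth = $dbh->prepare_cached(
        'SELECT * FROM my_table WHERE record_id = ?');
    $sth->execute($id);
    my $row = $sth->fetchrow_hashref;
    $sth->finish;
    return $row;
}

1;
```

The same persistence would also let you cache a big in-memory array across requests, but a per-request indexed SELECT keeps the web-server's memory footprint small, which is why I would still prefer it.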
CountZero
"A program should be light and agile, its subroutines connected like a string of pearls. The spirit and intent of the program should be retained throughout. There should be neither too little nor too much, neither needless loops nor useless variables, neither lack of structure nor overwhelming rigidity." - The Tao of Programming, 4.1 - Geoffrey James
Re^2: read a huge mysql Table into memory
by hiX0r (Acolyte) on May 30, 2009 at 18:01 UTC
by CountZero (Bishop) on May 31, 2009 at 06:32 UTC