it's not like i need to 'connect' to the database every time i want to pull the links... i just have to run the SQL query...
It's considerably more expensive to ship a query over the wire, compile it, execute it, and marshal the results to ship back over the wire to an application that then needs to unmarshal them than it is to open a local file and read some bytes into memory. Perhaps, though, that performance difference isn't significant in your application.
A perfectly reasonable approach is to consider the likely upper bound on traffic that your site will need to support, then do the simplest thing that will satisfy that performance requirement. If executing a query (or several queries) per page view works for you, go for it.
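To put rough numbers on that trade-off, something like the following sketch can compare the two approaches for your own data sizes. The table name, file layout, and row counts here are made up, and it assumes DBD::SQLite is installed; note that an in-memory SQLite handle skips the network round trip the comparison above describes, so treat its numbers as a lower bound on the real query cost.

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);
use DBI;
use File::Temp qw(tempfile);

# Build a throwaway in-memory database and a flat file holding the
# same links, so the two approaches can be timed side by side.
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                       { RaiseError => 1, AutoCommit => 1 });
$dbh->do('CREATE TABLE links (url TEXT, title TEXT)');
my $ins = $dbh->prepare('INSERT INTO links VALUES (?, ?)');
$ins->execute("http://example.com/$_", "Link $_") for 1 .. 100;

my ($fh, $filename) = tempfile();
print {$fh} "http://example.com/$_\tLink $_\n" for 1 .. 100;
close $fh;

cmpthese(-1, {
    # Run the query every time.
    query => sub {
        my $rows = $dbh->selectall_arrayref('SELECT url, title FROM links');
    },
    # Read the same data back from a local file.
    file => sub {
        open my $in, '<', $filename or die $!;
        my @rows = map { [ split /\t/ ] } <$in>;
        close $in;
    },
});
```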
The prepare_cached method prevents the statement handle from being created more than once (in a persistent environment like mod_perl), but that's all it does: the query still has to be executed by the database on every call. Pulling the data from a cache should be faster.
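Caching the query's results, rather than just the statement handle, might look something like this minimal sketch. The names (`cached`, `%cache`, the `links` key) are made up for illustration; in mod_perl, a package-level hash like this persists across requests within one child process.

```perl
use strict;
use warnings;

my %cache;
my $TTL = 60;    # seconds to keep a cached result before re-querying

sub cached {
    my ($key, $fetch) = @_;    # $fetch: code ref that runs the real query
    my $entry = $cache{$key};
    if ($entry && time - $entry->{when} < $TTL) {
        return $entry->{data};               # cache hit: no database work
    }
    my $data = $fetch->();                   # cache miss: run the query
    $cache{$key} = { data => $data, when => time };
    return $data;
}

# Usage with a hypothetical $dbh:
# my $links = cached('links', sub {
#     $dbh->selectall_arrayref('SELECT url, title FROM links');
# });
```

The trade-off is staleness: updates to the table won't show up until the TTL expires, so this only suits data (like a list of links) that changes rarely.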
Maybe you're getting ahead of yourself here. Is the site slow now? Have you profiled it to find out where the bottlenecks are? It could be that this particular query runs so fast that it's not worth caching.
well, the site is not slow, i was just looking to the future. though the site isn't finished, i wanted to start with the fastest possible execution, avoiding any bottleneck before it occurs.
and yes, i suppose i am getting ahead of myself, because the calls to the database are fairly simple SQL queries, so in reality i doubt they will pose a problem... i am just curious about faster execution possibilities.
thanks for the wisdom fellow monks