in reply to How to improve MYSQL search performance of perl?
The number of rows a query might have to consider, now that's an important measurement. The size of a row, both in the number of columns used in the query and in total bytes, matters as well, but much less so.
Having said that, 500 MB is tiny by modern standards. Most desktops, and even many laptops, will be able to keep almost the entire database in core memory - if you have a dedicated machine for your database (and you should), put in 1 GB of RAM, and you'll be sure the entire database fits in memory.
But even if you have that, your approach can still be "slow". Whether or not that is significantly improvable depends almost entirely on your database structure (tables, indices) and the queries performed. If the queries could be almost anything, there will be many queries that cannot make use of the given indices, resulting in table scans. And even with the entire database in core memory, having to do many table scans will slow things down.
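For what it's worth, a quick way to see whether a particular query can use an index is to run it through MySQL's EXPLAIN. A minimal sketch using DBI (the database name, table and column are made up for illustration):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Hypothetical connection details - replace with your own.
    my $dbh = DBI->connect('DBI:mysql:database=mydb;host=localhost',
                           'user', 'password',
                           { RaiseError => 1 });

    # Ask MySQL how it plans to execute the query. A 'type' of 'ALL'
    # means a full table scan; note that a LIKE with a leading wildcard
    # (as below) cannot use an ordinary index.
    my $sth = $dbh->prepare(
        'EXPLAIN SELECT * FROM articles WHERE title LIKE ?');
    $sth->execute('%camel%');

    while (my $row = $sth->fetchrow_hashref) {
        printf "table=%s type=%s key=%s rows=%s\n",
            $row->{table} // '-', $row->{type} // '-',
            $row->{key}   // '-', $row->{rows} // '-';
    }

    $sth->finish;
    $dbh->disconnect;

If EXPLAIN reports 'ALL' for the queries you care about, adding or rethinking indices (or restructuring the query) is where the real gains are, not in the Perl side of things.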
But as others said, this is mostly a database question. Consult your local database administrator/guru.