in reply to Re^2: speeding up row by row lookup in a large db
in thread speeding up row by row lookup in a large db

Well, in your question you said the whole database was 430MB, so you can see why I would suggest loading it into RAM. Perl should be able to access more than 2GB of RAM on a 64-bit machine, and to some extent on a 32-bit one if you have the right Linux kernel.
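For illustration, here's a minimal sketch of that RAM-caching approach: slurp the lookup table into a Perl hash once at startup, then do plain hash lookups instead of a query per row. The file, table, and column names (data.db, items, id, val) are placeholder assumptions; substitute your real schema.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Connect to the SQLite file (placeholder name).
    my $dbh = DBI->connect("dbi:SQLite:dbname=data.db", "", "",
                           { RaiseError => 1 });

    # Slurp the whole lookup table into a hash once.
    my %lookup;
    my $sth = $dbh->prepare("SELECT id, val FROM items");
    $sth->execute;
    while (my ($id, $val) = $sth->fetchrow_array) {
        $lookup{$id} = $val;
    }
    $dbh->disconnect;

    # Every lookup after this is an O(1) hash access, no SQL round trip:
    # my $val = $lookup{$some_id};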

INSERTs will definitely run faster if you batch them and only commit every 1000 rows (see the sketch below). There may be some other SQLite tuning tricks, which you'd probably find on a mailing list or wiki devoted to SQLite. But if none of those work for you, I think MySQL is your best bet.
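A rough sketch of the batched-commit idea, again with placeholder names and with the data source stubbed out as a small Perl array: run the INSERTs through a prepared statement inside an explicit transaction and commit once per 1000 rows. The PRAGMA line is one common SQLite speed-for-durability trade-off of the kind those lists discuss.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect("dbi:SQLite:dbname=data.db", "", "",
                           { RaiseError => 1, AutoCommit => 1 });

    # Placeholder schema so the sketch runs standalone.
    $dbh->do("CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, val TEXT)");

    # Optional: trade durability for speed while bulk-loading.
    $dbh->do("PRAGMA synchronous = OFF");

    my @rows = ([1, "foo"], [2, "bar"]);   # stand-in for your real data source

    $dbh->begin_work;                      # open the first transaction
    my $sth = $dbh->prepare("INSERT INTO items (id, val) VALUES (?, ?)");
    my $count = 0;
    for my $row (@rows) {
        $sth->execute(@$row);
        if (++$count % 1000 == 0) {
            $dbh->commit;                  # one commit per 1000 INSERTs
            $dbh->begin_work;              # open the next batch
        }
    }
    $dbh->commit;                          # flush the final partial batch
    $dbh->disconnect;

Committing per batch instead of per statement saves SQLite a full fsync-and-journal cycle on every row, which is usually where the time goes.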
