I read a bit about the query cache in MySQL, but these tables are constantly updated, anywhere from 10-16 million records a day. Since any write to a table invalidates the cached results for it, the cache would be flushed about as fast as it filled, so a repeated query would rarely get a hit.
I will check out DBM::Deep. I suppose it could go either way: send more parameters to the database so it issues the same query every time, or use a Perl module.
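For that first approach, here is roughly what I have in mind: grab a snapshot timestamp on the user's first request, stash it, and bind it on every repeat, so the identical statement with identical parameters returns identical rows. The connection details, table, and columns below are all made up:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Placeholder connection details.
    my $dbh = DBI->connect( 'dbi:mysql:database=logs', 'user', 'pass',
                            { RaiseError => 1 } );

    # On the first request, record a cutoff and keep it in the user's
    # session; every later request (re-sort, next page) reuses it.
    my ($cutoff) = $dbh->selectrow_array('SELECT NOW()');

    my $sth = $dbh->prepare(
        'SELECT id, logged_at, message FROM events
          WHERE logged_at <= ?
          ORDER BY logged_at'
    );

    # Same statement, same bound value => same result set, even while
    # new rows keep pouring into the table.
    $sth->execute($cutoff);
    while ( my $row = $sth->fetchrow_hashref ) {
        print "$row->{id}\t$row->{logged_at}\t$row->{message}\n";
    }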
Well, if the intent is for a user to be able to sort the same data they just saw, then it makes sense for them to see the same data each time. Another possibility is to use something like DBM::Deep to pull the data you want to show, then build your view from that intermediate cache.
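A rough sketch of that idea; the connection details, table, and cache key are invented for illustration, so adapt them to your setup:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;
    use DBM::Deep;

    my $dbh = DBI->connect( 'dbi:mysql:database=logs', 'user', 'pass',
                            { RaiseError => 1 } );

    # One file-backed cache, keyed on the user's search criteria, so a
    # repeat request (e.g. a re-sort) reuses the rows already fetched.
    my $cache = DBM::Deep->new( 'results_cache.db' );
    my $key   = 'search:2024-01-01:error';

    unless ( $cache->{$key} ) {
        $cache->{$key} = $dbh->selectall_arrayref(
            'SELECT id, logged_at, message FROM events
              WHERE logged_at >= ? AND level = ?',
            { Slice => {} },          # rows come back as hashrefs
            '2024-01-01', 'error',
        );                            # DBM::Deep writes the structure to disk
    }

    # Build the view (sorting, paging) from the cached copy, never
    # hitting the database again for this search.
    my @sorted = sort { $a->{logged_at} cmp $b->{logged_at} }
                 @{ $cache->{$key} };

You would still want to expire or delete stale keys at some point, but that is a housekeeping detail.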
My criteria for good software:
Does it work?
Can someone else come in, make a change, and be reasonably certain no bugs were introduced?