The only reasonable approach in this case is to keep all the data in memory (or at least in shared memory), since users generate 20-40 requests per second. If each of those requests triggers 10-30 queries against some db, then even if it works, I suspect it won't be very stable or scalable.
I think you'll be surprised. Databases are very good at this sort of thing. Benchmark, don't guess ;-)
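A minimal sketch of that "benchmark, don't guess" advice, using the core Benchmark module to compare a plain hash lookup against a prepared SELECT on an in-memory SQLite database. It assumes DBD::SQLite is installed; the kv table and the 10,000-key dataset are made up purely for illustration:

<code>
#!/usr/bin/perl
use strict;
use warnings;
use Benchmark qw(cmpthese);
use DBI;

# Build a small illustrative dataset: 10_000 key/value pairs.
my %in_memory = map { ("key$_" => "value$_") } 1 .. 10_000;

# Load the same data into an in-memory SQLite db, keyed by primary key.
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
    { RaiseError => 1, AutoCommit => 0 });
$dbh->do('CREATE TABLE kv (k TEXT PRIMARY KEY, v TEXT)');
my $ins = $dbh->prepare('INSERT INTO kv (k, v) VALUES (?, ?)');
$ins->execute("key$_", "value$_") for 1 .. 10_000;
$dbh->commit;

my $sel = $dbh->prepare('SELECT v FROM kv WHERE k = ?');

# Run each lookup style for ~2 CPU seconds and compare rates.
cmpthese(-2, {
    hash => sub {
        my $v = $in_memory{ 'key' . (1 + int rand 10_000) };
    },
    sqlite => sub {
        $sel->execute('key' . (1 + int rand 10_000));
        my ($v) = $sel->fetchrow_array;
        $sel->finish;
    },
});
</code>

The hash will of course win on raw lookup rate; the point is to see whether the database path is fast *enough* for 20-40 requests/second times 10-30 queries each, which is the actual question here.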
In reply to Re^3: Saving big blessed hashes to disk by adrianh in thread Saving big blessed hashes to disk by b888