in reply to Repetitive File I/O vs. Memory Caching

It scares me to think that with your current implementation, there is nothing preventing the cache from growing to the total size of your website (every page will become cached eventually). Multiply that by the potential for multiple instances of the same cache, and you could end up with a huge memory footprint. This is not a scalable approach. With a smarter caching method, care could be taken so that your cache never grows beyond a predetermined size, and at the same time you could find a way of dealing with multiple instances.
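To make that concrete, here is a minimal sketch of a size-bounded cache, assuming a plain Perl CGI script that serves pages from disk. Every name here (cache_fetch, MAX_ENTRIES, the example file name) is invented for illustration rather than taken from your code; it just shows one way of capping memory use with a least-recently-used eviction policy.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use constant MAX_ENTRIES => 100;    # upper bound on cached pages

    my %cache;     # path => page content
    my @recent;    # paths in order of use, most recently used last

    sub cache_fetch {
        my ($path, $loader) = @_;

        if (exists $cache{$path}) {
            # Cache hit: move this path to the most-recently-used end.
            @recent = grep { $_ ne $path } @recent;
            push @recent, $path;
            return $cache{$path};
        }

        # Cache miss: evict the least recently used page if we are full.
        if (@recent >= MAX_ENTRIES) {
            my $oldest = shift @recent;
            delete $cache{$oldest};
        }

        my $content = $loader->($path);
        $cache{$path} = $content;
        push @recent, $path;
        return $content;
    }

    # Example use: read the file from disk only on a cache miss.
    my $page = cache_fetch('about.html', sub {
        my ($file) = @_;
        open my $fh, '<', $file or die "can't read $file: $!";
        local $/;                 # slurp the whole file
        return scalar <$fh>;
    });

The point is not the particular eviction policy; it is simply that the cache has a hard ceiling, so it can never balloon to the size of the whole site.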

I would not be surprised if caching could be used to gain some performance improvements. But I have to wonder if the caching should, perhaps, be a layer built onto the database side of things rather than into the CGI script. Let a layer between the database and the CGI script deal with the nitty-gritty tasks of calculating which pages are most popular and limiting the cache size. Doing it this way could also make it easier to deal with multiple instances, since there need only be one instance of the database caching layer.
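Here is a rough sketch of what such a layer might look like, again with every name (PageCacheLayer, max_pages, the fetch coderef) invented for illustration rather than taken from your setup. It counts requests per page, keeps only the most popular pages in memory, and passes everything else through to the database. For it to help with the multiple-instance problem it would need to live in a single persistent process (e.g. under mod_perl or as a small daemon) rather than be re-created on every CGI invocation.

    package PageCacheLayer;
    use strict;
    use warnings;

    # Construct the layer with a maximum number of cached pages and a
    # coderef that knows how to fetch a page from the database.
    sub new {
        my ($class, %args) = @_;
        return bless {
            max_pages => $args{max_pages} || 50,
            fetch     => $args{fetch},
            hits      => {},    # page id => request count
            cache     => {},    # page id => cached content
        }, $class;
    }

    sub get {
        my ($self, $id) = @_;
        $self->{hits}{$id}++;

        # Serve popular pages straight from memory.
        return $self->{cache}{$id} if exists $self->{cache}{$id};

        # Otherwise fall through to the database.
        my $content = $self->{fetch}->($id);

        # Work out which pages are currently the most requested.
        my @popular = sort { $self->{hits}{$b} <=> $self->{hits}{$a} }
                      keys %{ $self->{hits} };
        splice @popular, $self->{max_pages} if @popular > $self->{max_pages};
        my %keep = map { $_ => 1 } @popular;

        if ($keep{$id}) {
            $self->{cache}{$id} = $content;
            # Drop anything that has fallen out of the popular set.
            delete $self->{cache}{$_}
                for grep { !$keep{$_} } keys %{ $self->{cache} };
        }
        return $content;
    }

    1;

A single instance of this layer could then sit in front of the database for all requests, e.g. (hypothetical call) my $layer = PageCacheLayer->new(max_pages => 50, fetch => \&load_page_from_database); print $layer->get($page_id); so each CGI script no longer carries its own private copy of the cache.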


Dave
