in reply to Perl/CGI + MySQL: How many calls is too many?

Each site is different. Some selects are really quick, while others (say, against millions of rows) could take forever. It all depends on the site, your hardware configuration, and especially the data you are trying to get. Have you looked to see where your bottleneck is? Is it in the database or in the webserver? Largely, that depends on what you're selecting.

While your question is rather vague, here are a few things you may want to look into: DBI placeholders are a good start, but you may also want to use Perl in a few places to cache your results. It would also help to know what kinds of things you are trying to do. Reply to the post with more details and we might be able to help you out more, but this is probably largely a MySQL question. Thanks.
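To make the placeholder suggestion concrete, here's a minimal sketch. It uses DBD::SQLite with an in-memory database only so it runs standalone; against MySQL you would change just the DSN. The `users` table and values are made up for illustration.

```perl
use strict;
use warnings;
use DBI;

# In-memory SQLite so the sketch is self-contained; for MySQL you would use
# something like 'dbi:mysql:database=mydb' with a user and password instead.
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                       { RaiseError => 1 });

$dbh->do('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)');
$dbh->do('INSERT INTO users (id, name) VALUES (1, ?)', undef, 'drewbie');

# Prepare once with a placeholder, execute many times: the driver can reuse
# the parsed statement, and DBI handles quoting of the bound value for you.
my $sth = $dbh->prepare('SELECT name FROM users WHERE id = ?');
$sth->execute(1);
my ($name) = $sth->fetchrow_array;
print "$name\n";   # prints "drewbie"
```

Besides saving a re-parse per query, placeholders also protect you from quoting bugs and SQL injection in a CGI context, which is reason enough to use them everywhere.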
  • Comment on Re: Perl/CGI + MySQL: How many calls is too many?

Replies are listed 'Best First'.
Re: Re: Perl/CGI + MySQL: How many calls is too many?
by drewbie (Chaplain) on Feb 27, 2002 at 23:18 UTC
    Cache your entries in Perl, and use them elsewhere. You might be able to shave off some cycles by throwing your results into a hashref and keeping them around a little longer.
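    The hashref idea looks something like this. The lookup function here is a stand-in for a real SELECT; it just counts its calls so the caching effect is visible.

    ```perl
    use strict;
    use warnings;

    # Hypothetical lookup that would normally hit the database; it counts
    # how often it runs so we can see the cache doing its job.
    my $db_calls = 0;
    sub fetch_user_name {
        my ($id) = @_;
        $db_calls++;
        return "user_$id";   # stand-in for a SELECT against MySQL
    }

    # Keep results in a hashref and reuse them for the life of the request.
    my $cache = {};
    sub cached_user_name {
        my ($id) = @_;
        $cache->{$id} = fetch_user_name($id) unless exists $cache->{$id};
        return $cache->{$id};
    }

    cached_user_name(42) for 1 .. 5;   # five lookups...
    print "$db_calls\n";               # ...but only one "database" call
    ```

    Under mod_perl a cache like this can even persist across requests, though then you have to think about when the cached data goes stale.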

    Excellent point. You might also want to consider the Cache::* modules for your caching needs. They offer fixed-size caching, which can help keep you from running out of RAM or disk space. The file-based caches are very fast, and might be ideal for less frequently needed data.
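    This is not the Cache::* API itself, just a pure-Perl sketch of the fixed-size idea those modules give you: once the cache hits its limit, the oldest entry is evicted to make room. The size limit and keys are made up for illustration.

    ```perl
    use strict;
    use warnings;

    my $MAX_ENTRIES = 3;
    my %cache;
    my @order;          # insertion order, oldest key first

    # Store a value, evicting the oldest entry when the cache is full --
    # the same bounded-size behavior Cache::SizeAwareFileCache provides.
    sub cache_set {
        my ($key, $value) = @_;
        unless (exists $cache{$key}) {
            push @order, $key;
            if (@order > $MAX_ENTRIES) {
                my $oldest = shift @order;
                delete $cache{$oldest};
            }
        }
        $cache{$key} = $value;
    }

    cache_set($_, "value_$_") for 1 .. 4;
    print exists $cache{1} ? "kept\n" : "evicted\n";   # prints "evicted"
    ```

    The real modules add expiry times and shared on-disk storage on top of this, so you get the bound on memory without writing the bookkeeping yourself.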

    Also take a look at Perrin Harkins's excellent eToys article on perl.com for a great story on effective and intelligent caching.