in reply to How do I prevent more than one request (to some URL) per second with CGI?

"No more than 1 request/second" can be sorta tricky.

No more than 1 GET request? Do images on a page count?

By the way, if you write your application to send back a redirect (HTTP status 302, I believe?), the limit can be "bypassed" -- because the follow-up request comes from the user's browser, not from your script.

It sounds like you're grabbing some info and processing it before you spit it back out to the user. Have you thought about caching the data so you don't have to make an upstream request every time?
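One way to combine caching with the once-per-second limit is to keep the cached result and its timestamp in a file, and take an exclusive lock on that file so concurrent CGI processes serialize on it. A minimal sketch of the idea (shown in Python for brevity; the same pattern works in Perl with `flock` and a timestamp file -- `fetch_remote`, the cache path, and the JSON layout are all placeholders, not anything from the original post):

```python
import fcntl, json, os, time

CACHE_FILE = "/tmp/fetch_cache.json"   # assumed writable path
MIN_INTERVAL = 1.0                     # at most one real fetch per second

def fetch_remote():
    # Placeholder for the actual upstream request.
    return {"fetched_at": time.time()}

def throttled_fetch():
    # Open (or create) the cache file and take an exclusive lock so that
    # concurrent CGI processes serialize on it.
    fd = os.open(CACHE_FILE, os.O_RDWR | os.O_CREAT)
    try:
        fcntl.flock(fd, fcntl.LOCK_EX)
        raw = os.read(fd, 65536)
        try:
            cache = json.loads(raw)
        except ValueError:              # empty or corrupt cache file
            cache = {"stamp": 0, "data": None}
        now = time.time()
        if cache["data"] is None or now - cache["stamp"] >= MIN_INTERVAL:
            # Stale (or missing) entry: do one real fetch and rewrite the file.
            cache = {"stamp": now, "data": fetch_remote()}
            os.lseek(fd, 0, os.SEEK_SET)
            os.ftruncate(fd, 0)
            os.write(fd, json.dumps(cache).encode())
        return cache["data"]
    finally:
        fcntl.flock(fd, fcntl.LOCK_UN)
        os.close(fd)
```

Every request inside the same one-second window gets the cached copy, so the upstream URL is hit at most once per second no matter how many CGI processes run at once.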

<-> In general, we find that those who disparage a given operating system, language, or philosophy have never had to use it in practice. <->

Replies are listed 'Best First'.
Re^2: How do I prevent more than one request (to some URL) per second with CGI?
by Aristotle (Chancellor) on Nov 28, 2002 at 14:06 UTC
    Images are unlikely to be of interest when a script is fetching data. A redirect is obviously useless here, since it doesn't get the data to the script. Note also that the cache itself must be synchronized so it doesn't update more than once per second -- so a cache alone doesn't automatically guarantee strict compliance with the requirement. If you already have some sort of serializing mechanism, though, adding caching to it is an excellent proposal.

    Makeshifts last the longest.