in reply to How do I prevent more than one request (to some URL) per second with CGI?

IPC::Shareable and IPC::ShareLite look like possibilities.
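
For instance, a minimal sketch with IPC::ShareLite might keep the epoch time of the last outbound request in a shared memory segment, with an exclusive lock serializing the CGI processes. The key (1971) and the one-second gap are arbitrary choices of mine, not anything mandated by the module:

    use strict;
    use IPC::ShareLite qw( :lock );

    # Shared segment holding the epoch time of the last outbound request.
    # The key is an arbitrary integer that all the CGI processes agree on.
    my $share = IPC::ShareLite->new(
        -key     => 1971,
        -create  => 'yes',
        -destroy => 'no',
    ) or die "Cannot attach to shared memory: $!";

    $share->lock( LOCK_EX );          # serializes all CGI processes
    my $last = $share->fetch() || 0;  # empty on first run
    my $wait = ($last + 1) - time();
    sleep($wait) if $wait > 0;        # enforce the one-second gap
    $share->store( time() );
    $share->unlock();

    # ... now make the request to Amazon ...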

An example of what I meant in the first post: take two users, both of whom click a link that fetches XML data from Amazon.com at the exact same time. What happens when the script goes to get the data? Two copies of the script will call the Amazon servers simultaneously, which exceeds the one-request-per-second limit. This is what I am trying to prevent, and why I was going to try the database route. I may be wrong, though. Would the system actually run two copies of the same script within one second of each other? Or would there be enough time (a second or more) between executions that I need not worry about two or more requests occurring at once?
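
For what it's worth, here is a rough sketch of what I had in mind for the database route. It assumes a MySQL table with a transactional engine (InnoDB), and the table, row, and credentials are made up for illustration:

    use strict;
    use DBI;

    # Hypothetical one-row table:
    #   CREATE TABLE throttle (id INT PRIMARY KEY, last_hit INT) TYPE=InnoDB;
    # seeded once with: INSERT INTO throttle VALUES (1, 0);
    my $dbh = DBI->connect('dbi:mysql:mydb', 'user', 'pass',
                           { RaiseError => 1, AutoCommit => 0 });

    # SELECT ... FOR UPDATE blocks other processes on this row until we
    # commit, so only one CGI process checks the timestamp at a time.
    my ($last) = $dbh->selectrow_array(
        'SELECT last_hit FROM throttle WHERE id = 1 FOR UPDATE');

    my $wait = ($last + 1) - time();
    sleep($wait) if $wait > 0;        # enforce the one-second gap

    $dbh->do('UPDATE throttle SET last_hit = ? WHERE id = 1',
             undef, time());
    $dbh->commit;                     # releases the row lock

    # ... now make the request to Amazon ...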


Re: How do I prevent more than one request (to some URL) per second with CGI?
by cLive ;-) (Prior) on Nov 28, 2002 at 17:08 UTC
    In that case, look at lockfiles. merlyn wrote a good article, "The poor man's load balancer", in the very late Web Techniques, which he has very kindly kept online here.
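
    Something along these lines, say. This is a minimal sketch of the lockfile approach, assuming flock is available on your platform; the /tmp paths are placeholders of mine, and the stamp file's mtime records when the last request went out:

        #!/usr/bin/perl -w
        use strict;
        use Fcntl qw(:flock);

        # Made-up paths; any writable files all CGI processes share will do.
        my $lockfile  = '/tmp/amazon.lock';
        my $stampfile = '/tmp/amazon.stamp';

        open(LOCK, ">$lockfile") or die "Cannot open $lockfile: $!";
        flock(LOCK, LOCK_EX)     or die "Cannot lock: $!";  # serializes the scripts

        # The stamp file's mtime records when the last request went out.
        my $last = (stat $stampfile)[9] || 0;
        my $wait = ($last + 1) - time();
        sleep($wait) if $wait > 0;           # enforce the one-second gap

        open(STAMP, ">$stampfile") or die "Cannot touch $stampfile: $!";
        close(STAMP);                        # mtime is now "now"

        # ... make the Amazon request here, then ...
        close(LOCK);                         # releases the lock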

    .02

    cLive ;-)