in reply to How do I prevent more than one request (to some URL) per second with CGI?

You could do that relatively easily by wrapping an HTTP::Daemon around an LWP::RobotUA (using $robot->delay(1/60); — the delay is specified in minutes, so 1/60 of a minute is one second) to create a proxy that serves only one request per second per host. Be sure to reject URLs for any server other than the one you want, and preferably have it listen only on localhost. Then just use LWP::UserAgent as usual, with localhost and the appropriate port set as the proxy. (You will need a fair amount of proficiency with the HTTP protocol to get it all right, of course, but that should be by far the fastest approach to get going.)
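The idea above could be sketched roughly like this. This is only an illustration, not tested code: the port, the allowed host, and the agent/from strings are made-up values, and a real proxy would need more careful header handling than shown here.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use HTTP::Daemon;
use LWP::RobotUA;

# Hypothetical: the one server we allow requests to.
my $allowed_host = 'www.example.com';

# LWP::RobotUA takes its delay in *minutes*, so 1/60 of a
# minute gives one second between requests to the same host.
my $robot = LWP::RobotUA->new(
    agent => 'throttling-proxy/0.1',     # made-up agent string
    from  => 'you@example.com',          # made-up contact address
);
$robot->delay( 1 / 60 );

# Listen on localhost only, so we are not an open proxy.
my $d = HTTP::Daemon->new(
    LocalAddr => '127.0.0.1',
    LocalPort => 8080,                   # pick any free port
) or die "Cannot start daemon: $!";

while ( my $conn = $d->accept ) {
    while ( my $req = $conn->get_request ) {
        # Proxy requests carry an absolute URI, so we can
        # check the host and reject everything else.
        if ( $req->uri->host ne $allowed_host ) {
            $conn->send_error( 403, 'Host not allowed' );
            next;
        }
        # RobotUA sleeps as needed to honour the per-host delay.
        my $res = $robot->request( $req );
        $conn->send_response( $res );
    }
    $conn->close;
    undef $conn;
}
```

A client would then use it with something like $ua->proxy( http => 'http://127.0.0.1:8080/' ); on its LWP::UserAgent. Note that LWP::RobotUA also fetches and obeys robots.txt by default, which may or may not be what you want here.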

Makeshifts last the longest.
