in reply to Tailing the access log

I dunno about that solution, but if all you're trying to do is get a rough estimate of the number of docs served per second, let me suggest an alternative method, easily done in Perl:

Grab the last n lines of the access log via tail and pipe them into a filehandle as you have it now (no -f, of course). 'n' should be based on how active you believe your server is: if you get 100,000 hits a day on average, that's about 100,000 / 1440 ≈ 70 hits per minute, so n around 100 covers roughly the last minute and a half of traffic.
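
Something along these lines would do that first step (a minimal sketch; the log path and the value of n are assumptions, adjust for your own setup):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $log = '/var/log/httpd/access_log';   # assumed path; yours may differ
    my $n   = 100;                           # tune to your traffic, per above

    # Pipe the last n lines of the log through tail (no -f).
    open(my $fh, '-|', 'tail', '-n', $n, $log)
        or die "can't run tail on $log: $!";
    my @lines = <$fh>;
    close($fh);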

Grab the time string from the first of those lines, and store it.

Subtract that date from the current time (using something like Date::Manip) to get the number of minutes elapsed, t_min.
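
If you'd rather skip the Date::Manip dependency, plain Time::Local from the core can manage it; a sketch, assuming the standard common log format timestamp (e.g. [10/Oct/2000:13:55:36 -0700]) and that the log and the script share a timezone:

    use Time::Local;

    my %mon = ( Jan => 0, Feb => 1, Mar => 2, Apr => 3, May => 4,  Jun => 5,
                Jul => 6, Aug => 7, Sep => 8, Oct => 9, Nov => 10, Dec => 11 );

    # Pull the timestamp out of the first line.
    my ($mday, $mname, $year, $h, $m, $s) =
        $lines[0] =~ m{\[(\d+)/(\w+)/(\d+):(\d+):(\d+):(\d+)}
        or die "no timestamp in: $lines[0]";

    my $then  = timelocal($s, $m, $h, $mday, $mon{$mname}, $year);
    my $t_min = (time() - $then) / 60;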

n / t_min then gives a reasonable estimate of the number of requests served per minute.

If you want to limit it strictly to docs, run through the n lines you get back from tail and increment a counter for each line that matches the doc type you actually want (e.g. look for /\.html/); call that counter x. x / t_min is then the number of HTML files served per minute.
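
Putting those last two steps together (again a sketch; the /\.html/ pattern is just the example above, widen it to taste):

    # x = how many of the n lines were actual docs.
    my $x = grep { /\.html/ } @lines;

    $t_min ||= 1 / 60;    # guard against a zero interval on a quiet log
    printf "%.1f requests/min, %.1f docs/min\n",
        @lines / $t_min, $x / $t_min;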

This should be sufficiently fast that you can run it every minute or so without any major slowdowns and get a good running stat on your averages.
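
To run it every minute, a crontab entry would do; the script name and output path here are made up:

    * * * * * /usr/local/bin/hitrate.pl >> /var/tmp/hitrate.log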


Dr. Michael K. Neylon - mneylon-pm@masemware.com || "You've left the lens cap of your mind on again, Pinky" - The Brain