in reply to Perl Programming Logic

You'll need to either get a better tracking system (something other than a raw log), or you'll have to get creative.

For tracking, might I suggest first grouping the log entries by user: put all your 192.168.1.214 requests into one group, and every other IP into its own group. Then check each person's URLs. To make it easier, you could strip out requests to known places like ads.x10.com, since such a request is most likely a pop-up and not something the user intended.
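A minimal sketch of that grouping step, assuming the log is in Common Log Format and that ad hosts show up in the request URL (adjust the regex and the %ad_hosts list to your own log):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hosts whose requests we consider pop-up/ad noise -- an assumption,
# extend as needed.
my %ad_hosts = map { $_ => 1 } qw(ads.x10.com);

# Group Common Log Format lines by client IP, skipping ad requests.
sub group_by_ip {
    my @lines = @_;
    my %by_ip;
    for my $line (@lines) {
        # CLF: host ident user [time] "METHOD url PROTO" status bytes
        next unless $line =~ m{^(\S+) \S+ \S+ \[([^\]]+)\] "(?:GET|POST) (\S+)};
        my ($ip, $time, $url) = ($1, $2, $3);
        next if grep { index($url, $_) >= 0 } keys %ad_hosts;
        push @{ $by_ip{$ip} }, { time => $time, url => $url };
    }
    return \%by_ip;
}
```

Once the requests are bucketed per IP, each bucket can be examined on its own.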

Creativity steps in here. You will have to assume that a user who requests /index.html, /left.html, and /right.html within 5 seconds just loaded a frameset. Consider that one request. Now say your user gets /index.html, then /blue.html, then /red.html, then /green.html over a 3-minute period. That's four requests that did not occur within a few-second time frame. The user should be considered "surfing" for those 3 minutes, because those requests were more than likely made by a human.
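The clustering idea above can be sketched like this, assuming you've already converted each request's timestamp to epoch seconds; the 5-second window is a guess you'd want to tune:

```perl
use strict;
use warnings;

# Collapse bursts of requests into "page loads": requests that arrive
# within $window seconds of the previous one (a frameset firing off
# /index.html, /left.html, /right.html) count as one load; anything
# farther apart starts a new load and suggests a human click.
sub count_page_loads {
    my ($window, @times) = @_;    # @times must be sorted ascending
    return 0 unless @times;
    my $loads = 1;
    my $last  = $times[0];
    for my $t (@times[ 1 .. $#times ]) {
        $loads++ if $t - $last > $window;
        $last = $t;
    }
    return $loads;
}
```

With a 5-second window, the frameset example (three requests at 0, 1, and 2 seconds) counts as one load, while the same burst followed by requests a minute apart counts each later click separately.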

Now, bringing relative time into the situation can cause some headaches, too. When I surf, I usually have 3-10 windows open. You will need to find some way to distinguish a clicked link from a cold request, and a method to determine what kind of information is held on the requested page. You could do that with LWP: read through the fetched page to see if there are tags for Shockwave games, streaming video, or large amounts of text (online books), and draw conclusions from those results combined with your request log.
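A rough, purely illustrative classifier along those lines (in a real run you'd fetch the page with LWP::Simple's get($url) first; the tag patterns and the 5000-character "long read" threshold here are assumptions):

```perl
use strict;
use warnings;
# use LWP::Simple;   # my $html = get($url);  -- fetch step, omitted here

# Guess what kind of content a page holds from its HTML. Crude by
# design: real pages will need better heuristics.
sub classify_page {
    my ($html) = @_;
    return 'shockwave' if $html =~ /<(?:embed|object)[^>]*(?:shockwave|\.swf)/i;
    return 'video'     if $html =~ /<(?:embed|object)[^>]*(?:\.asf|\.rm|\.mov)/i;
    (my $text = $html) =~ s/<[^>]*>//g;       # strip tags crudely
    return 'long-read' if length($text) > 5000;
    return 'ordinary';
}
```

A page that classifies as 'shockwave' or 'video' probably held the user's attention longer than its single log line suggests.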

It can be done, but it will be difficult without some other form of recorded information. If you have the resources, you could set up a web proxy that monitors clicked links and observe the information that way. Or you could avoid Perl altogether: install BackOrifice on the machine you want to monitor and watch what your users are viewing.

Anonymonk, might I suggest that you create a user name and stay awhile. :)

John J Reiser
newrisedesigns.com