We want to supplement Google Analytics for a few reasons, not least because we want site traffic information held in our own database so we can interrogate it automagically. We've created a database table to hold this data.

Within the common header method, we've added some code that sets a cookie with a max age of 2 hours, refreshing it if it is already set. If the cookie isn't already there, we write a row to the database table with the entry time, entry page, etc. If the cookie exists, we update the row with the exit page and exit time and bump the page count.
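In outline, the logic looks something like this - a sketch rather than the real code, so the connection details and some column names (lastVisit, pageCount, idSite_Visit) are placeholders, and it uses Expires where the real code sets Max-Age:

    use strict;
    use warnings;
    use CGI;
    use CGI::Cookie;
    use DBI;

    my $cgi = CGI->new;
    my $dbh = DBI->connect('DBI:mysql:database=site', 'user', 'password',
                           { RaiseError => 1 });   # placeholder credentials

    my %cookies = CGI::Cookie->fetch;

    if (my $c = $cookies{'orsa'}) {
        # Cookie present: refresh it and update the existing row with
        # the current page as exit page, plus time, and bump the count
        my $visit_id = $c->value;
        print $cgi->header(-cookie => CGI::Cookie->new(
            -name => 'orsa', -value => $visit_id, -expires => '+2h'));
        $dbh->do("UPDATE Site_Visit
                     SET lastVisit = NOW(), lastPage = ?,
                         pageCount = pageCount + 1
                   WHERE idSite_Visit = ?",
                 undef, $ENV{'REQUEST_URI'}, $visit_id);
    }
    else {
        # No cookie: record a new visit, then set the cookie
        $dbh->do("INSERT INTO Site_Visit
                     SET firstVisit = NOW(), firstPage = ?, lastPage = ?",
                 undef, $ENV{'REQUEST_URI'}, $ENV{'REQUEST_URI'});
        my $visit_id = $dbh->last_insert_id(undef, undef, 'Site_Visit', undef);
        print $cgi->header(-cookie => CGI::Cookie->new(
            -name => 'orsa', -value => $visit_id, -expires => '+2h'));
    }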

This approach is working and it's been running for a week.

But, it is reading about 11 times higher for site traffic than Google Analytics. I'd expect some discrepancy, but not that much. Looking at the visits, we are getting quite a few with the same or very close timestamps, so my best guess is that it's a client that isn't accepting the cookie - perhaps a web crawler. To check this out, I've added IP and User Agent to the database table and, sure enough, these have the user agent of a crawler/bot.
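A quick aggregate query makes the offenders easy to spot - an untested sketch, using the same $dbh as above:

    my $rows = $dbh->selectall_arrayref(
        "SELECT userAgent, COUNT(*) AS hits
           FROM Site_Visit
          GROUP BY userAgent
          ORDER BY hits DESC
          LIMIT 20");
    # Print hit count and user agent, busiest first
    printf "%6d  %s\n", $_->[1], $_->[0] for @$rows;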

To solve this, I've added a condition to the line that writes the new row to the database:

$dbh->do("INSERT INTO Site_Visit SET firstVisit = NOW(), lastPage = ?, + firstPage = ?, IP = ?, userAgent = ?, orsa = ?, orta = ?, Person_idP +erson = ?", undef, $ENV{'REQUEST_URI'}, $ENV{'REQUEST_URI'}, $ENV{'REMOTE_ADDR' +}, $ENV{'HTTP_USER_AGENT'}, $cookie{'orsa'}, $data{'orta'}, $user) unless $ENV{'HTTP_USER_AGENT'} =~ /bot/i or $ENV{'HTTP_USER_AGEN +T'} =~ /facebook/i or $ENV{'HTTP_USER_AGENT'} =~ /dataprovider/i;
This seems to be working...but...the list of 'blocked' user agent strings could get quite large.

Is there a more Perlish way to write this condition?
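One idea I've had (an untested sketch) is to keep the patterns in a single list and build one alternation regex from it, so the condition stays the same length however many patterns there are:

    my @bot_patterns = ('bot', 'facebook', 'dataprovider');
    my $bot_re = join '|', map quotemeta, @bot_patterns;

    # True if any pattern matches the user agent, case-insensitively
    my $is_bot = ($ENV{'HTTP_USER_AGENT'} // '') =~ /$bot_re/i;

and then the INSERT above just ends with ... unless $is_bot;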

I did think of putting them all in a database table and querying the user agent string against it:

SELECT ? IN ( SELECT userAgent FROM Blocked_Users )
(untested)

But, that would mean having the full and exact user agent strings instead of using a regexp.
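Although, if the database is MySQL (the INSERT ... SET syntax above is MySQL-flavoured), its REGEXP operator can match the incoming string against patterns stored in the table, so the rows could hold patterns rather than exact strings - again untested:

    # Returns 1 if any stored pattern matches the user agent
    my ($is_bot) = $dbh->selectrow_array(
        "SELECT 1 FROM Blocked_Users WHERE ? REGEXP userAgent LIMIT 1",
        undef, $ENV{'HTTP_USER_AGENT'});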

Note that I don't want to block crawlers; I just don't want them written to the site visit logs. This makes it quite difficult to Google for, because most articles are about blocking crawlers and bots from a website.

