in reply to newbie table structure

Ignoring whether this is the best way to log spider accesses, let's look at the database portion. You really don't want table creation to be part of the cgi access logging--you don't want to create the tables on every request! Create a separate 'installation' script that creates the tables once. Next, notice how all your tables look the same? That's a sure sign there's a problem! Whether you want to call it refactoring or normalization, we need to avoid cut-n-paste code. Consider that which agent/robot hit the cgi is just another piece of data, like so:

CREATE TABLE hits (
  id        int NOT NULL auto_increment,
  useragent char(32) NOT NULL,  -- added field for ua
  ip        char(32) NOT NULL,
  date      char(32) NOT NULL,
  PRIMARY KEY (id)
);
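
For example, a googlebot request becomes nothing more than a row (the values here are made up for illustration, and 'hits' is just the name I gave the table above):

INSERT INTO hits (useragent, ip, date)
VALUES ('Googlebot/2.1', '66.249.66.1', '2004-06-01 12:34:56');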

Now we have 1 table. In order to hold only the last 10 hits from each different user agent, we'll need to get a little clever (or stupid depending on your viewpoint) with our updates. First, I think we should add another column to track the order of the requests for each ua. (We could achieve the result using the date column, but I think it's clearer for the SQL uninitiated to add a column and avoid the date handling.)

... countdown int NOT NULL DEFAULT 0, ...
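
If you've already created the table, you can bolt the column on after the fact instead of rebuilding; a minimal sketch, using the 'hits' name from above:

ALTER TABLE hits ADD COLUMN countdown int NOT NULL DEFAULT 0;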

Now when we add a new row, it starts with a countdown of 0. On each hit, insert the new row first, then increment the countdown for every row with that ua (the fresh row moves from 0 to 1), and finally delete any entry with a countdown > 10. That leaves exactly the last 10 rows per ua.

UPDATE hits SET countdown = countdown + 1 WHERE useragent = ?;
DELETE FROM hits WHERE countdown > 10;
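
Put together, each request runs three statements, in this order (the ? placeholders are bound with values by your cgi script):

-- per request: insert the hit, bump the counters, trim the old rows
INSERT INTO hits (useragent, ip, date) VALUES (?, ?, ?);
UPDATE hits SET countdown = countdown + 1 WHERE useragent = ?;
DELETE FROM hits WHERE countdown > 10;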

That's enough of a start for you!

--Solo

--
You said you wanted to be around when I made a mistake; well, this could be it, sweetheart.