Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I've added some simple form scripts to our website (e.g. a 6-field comment form) and have included the remote IP as a hidden field.

Each night at about the same time, the hosting server (a known internal IP) must be running a web-harvesting engine for the website's search engine (or something similar), since each morning I receive empty forms from the same server IP address.

Would making one or more fields mandatory prevent this from happening, or are there other ways to prevent robots, etc. from triggering simple form scripts?

Replies are listed 'Best First'.
Re: robots and perl...
by arhuman (Vicar) on Dec 17, 2001 at 20:36 UTC
    Use a 'robots.txt' file (for parts of your site)
    or
    the META HTML tag (for one page): <META name="ROBOTS" content="NOINDEX, NOFOLLOW">
    (see this page for an example/explanation)

    UPDATE: This page seems to be more informative.
    But as always, use your search engine to find THE one that fits you...
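
    For instance, here's a minimal sketch of how a well-behaved robot consults
    robots.txt before fetching a page, using the standard WWW::RobotRules
    module; the bot name and URLs below are placeholders:

        use strict;
        use LWP::Simple qw(get);
        use WWW::RobotRules;

        # Hypothetical user-agent name; a polite robot identifies itself.
        my $rules = WWW::RobotRules->new('ExampleBot/1.0');

        # Fetch and parse the site's robots.txt (placeholder host).
        my $robots_url = 'http://www.example.com/robots.txt';
        my $robots_txt = get($robots_url);
        $rules->parse($robots_url, $robots_txt) if defined $robots_txt;

        # allowed() returns false for any URL that robots.txt disallows
        # for this user-agent, so the robot skips your form script.
        print "Fetch allowed\n"
            if $rules->allowed('http://www.example.com/cgi-bin/comment.pl');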

    "Only Bad Coders Code Badly In Perl" (OBC2BIP)
Re: robots and perl...
by cfreak (Chaplain) on Dec 17, 2001 at 20:39 UTC
    Not really a Perl solution, but if the robot that's causing you trouble is well-behaved, it should follow the robot exclusion rules. You can set up a robots.txt file on your machine. It would look something like this:
    User-agent: *
    Disallow: /path/to/your/script

    If the robot misbehaves and ignores these rules, then making a field required will keep you from getting the empty submissions, but it won't keep the robot from running the script.
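
    For what it's worth, here's a minimal sketch of that server-side check
    using CGI.pm; the field names are made up for illustration:

        #!/usr/bin/perl
        use strict;
        use CGI;

        my $q = CGI->new;

        # Hypothetical required fields from the comment form.
        my @required = qw(name email comment);
        my @missing  = grep { !defined $q->param($_) || $q->param($_) eq '' }
                       @required;

        if (@missing) {
            # A robot blindly triggering the script usually leaves these
            # blank, so refuse (or just log) instead of mailing an empty form.
            print $q->header('text/html'),
                  "Missing required field(s): @missing";
            exit;
        }

        # ... otherwise process the comment as before ...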

    Chris