
On the topic of robots.txt, why would anyone even use it? If you don't want a page accessed, restrict access to it on the server. Relying on every client out there to play nice isn't a very smart move; plenty of them have hidden motives :)
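
To make the point concrete: the cooperation lives entirely in the client. In Perl, LWP::RobotUA checks robots.txt before fetching and refuses disallowed URLs, while a plain LWP::UserAgent ignores the file completely, so nothing is enforced on the server side. A rough sketch (the agent name, contact address and URL are just placeholders):

    # A polite robot: LWP::RobotUA fetches /robots.txt first and refuses
    # URLs that file disallows. Nothing forces a crawler to use it, which
    # is exactly why robots.txt is a request, not access control.
    use strict;
    use warnings;
    use LWP::RobotUA;

    my $ua = LWP::RobotUA->new('example-bot/0.1', 'webmaster@example.com');
    $ua->delay(1);    # wait at least 1 minute between requests to one host

    my $res = $ua->get('http://www.example.com/private/page.html');
    if ($res->is_success) {
        print $res->content;
    }
    else {
        # for a disallowed URL this prints something like
        # "403 Forbidden by robots.txt"
        print $res->status_line, "\n";
    }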

Re^5: Web Robot
by schumi (Hermit) on Jul 17, 2003 at 09:45 UTC
    Quite true, although most major search engines do actually heed the robots.txt file, provided it is set up properly.

    I think the easiest way to restrict access to a directory is to set up a proper .htaccess file. You could even restrict access by IP address...
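
    Something along these lines, for example (Apache's mod_access syntax; the address range is obviously just a placeholder):

        # .htaccess in the directory you want to protect
        Order deny,allow
        Deny from all
        Allow from 192.168.1.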

    On the other hand, using a robots.txt file (in addition to the above, note!) decreases the number of 404s in your error logs... ;-)
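
    Even a bare-bones robots.txt does the trick for that purpose, something like this (the path is just an example):

        User-agent: *
        Disallow: /private/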

    --cs

    There are nights when the wolves are silent and only the moon howls. - George Carlin