Well, that certainly makes more sense than say, dynamically altering firewall rules (yes, I've seen that). :)
A well-behaved search engine bot SHOULD be discernible by its UA string (I doubt the script kiddies bother to change theirs), and you may also want to note whether a client has ever requested /robots.txt...
Granted, none of this is a sure thing, but a combination of "tests" may get you close enough to what you want without restricting legitimate visitors...
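Something like the following sketch shows what I mean by combining those tests. All the names and thresholds here are made up for illustration; in practice you'd hook this into your logging or request handling however suits you:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical scoring sketch -- names and weights are invented,
# not from any particular module.
my %fetched_robots;   # IPs that have ever requested /robots.txt

sub scraper_score {
    my ($ip, $ua, $uri) = @_;
    $fetched_robots{$ip}++ if $uri eq '/robots.txt';

    my $score = 0;
    # Well-behaved bots announce themselves in the UA string
    $score -= 2 if $ua =~ /\b(Googlebot|Slurp|bingbot)\b/i;
    # A client that never asked for /robots.txt is a bit more suspect
    $score += 1 unless $fetched_robots{$ip};
    # A blank or stock library UA is a common script-kiddie tell
    $score += 2 if $ua eq '' || $ua =~ /^(libwww-perl|curl|wget)/i;
    return $score;
}

# e.g. throttle or block once the combined score crosses some threshold
print scraper_score('203.0.113.7', 'libwww-perl/6.05', '/some/page'), "\n";
# prints 3
```

The point isn't any single check -- it's that each one is cheap and weak on its own, but summed together they separate the polite crawlers from the scrapers well enough to act on.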
$/ = q#(\w)# ; sub sig { print scalar reverse join ' ', @_ } sig map { s$\$/\$/$\$2\$1$g && $_ } split( ' ', ",erckha rlPe erthnoa stJu" );
In reply to Re^3: blocking site scrapers
by chargrill
in thread blocking site scrapers
by Anonymous Monk