in reply to Re^4: blocking site scrapers
in thread blocking site scrapers

/me humbly searches through his own httpd.conf and finds

    SetEnvIf Request_URI "winnt/system32/cmd\.exe" worm
    # etc ...
    CustomLog "|exec sh" "/sbin/route -nq add -host %a 127.0.0.1 -blackhole" env=worm

... so I guess to answer your question: nothing is wrong with it per se. This was a somewhat popular method for keeping Nimda, Code Red, sadmind, etc. from doing too much damage to web servers a few years ago. More can be read here: log monitors and here: securityfocus. Those links even suggest that local or upstream firewalling would be more efficient.
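For the curious: the trick works because Apache writes each formatted log line down the pipe, and `exec sh` happily executes whatever it reads, so the "log format" is really a shell command with %a expanded to the client address. A slightly tamer variation is to pipe just the address to a small Perl script that dedups before shelling out. A minimal sketch only; the script path is hypothetical and the route(8) flags are the BSD-style ones from my config above:

    SetEnvIf Request_URI "winnt/system32/cmd\.exe" worm
    CustomLog "|/usr/local/sbin/blackhole.pl" "%a" env=worm

    #!/usr/bin/perl
    # blackhole.pl -- hypothetical pipe target for Apache's CustomLog.
    # Apache feeds us one client address (%a) per matching request;
    # we null-route each offender once instead of on every hit.
    use strict;
    use warnings;

    my %seen;    # addresses we've already blackholed

    while ( my $ip = <STDIN> ) {
        chomp $ip;
        next unless $ip =~ /\A\d{1,3}(?:\.\d{1,3}){3}\z/;  # plain IPv4 only
        next if $seen{$ip}++;
        # same BSD route(8) invocation as the httpd.conf above
        system '/sbin/route', '-nq', 'add', '-host', $ip, '127.0.0.1', '-blackhole';
    }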



--chargrill
$/ = q#(\w)#; sub sig { print scalar reverse join ' ', @_ } sig map { s$\$/\$/$\$2\$1$g && $_ } split( ' ', ",erckha rlPe erthnoa stJu" );