in reply to Batch Google URL Removal Script

As a quick solution, wouldn't it work to put up a robots.txt that tells robots to stay away from all of those pages (you'd also take the pages down, of course), and then submit just one page in Google's removal form, hoping that Google loads the robots.txt and figures out it should remove all of the other pages as well?
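
For example, a robots.txt along these lines would keep compliant crawlers away from the whole batch (the /old-pages/ path is just a placeholder for wherever the pages actually live):

    User-agent: *
    Disallow: /old-pages/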

Re^2: Batch Google URL Removal Script
by BrowserUk (Patriarch) on Jan 22, 2010 at 22:28 UTC

    Isn't it better to leave the page up, but with blank content, rather than removing it? That way it gets overwritten in Google's cache instead of frozen there in perpetuity.

    Many times I've come across pages where the host no longer serves the URL, but its ghost is still available from Google's cache for weeks afterwards.
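
    Something like this minimal placeholder would do; the noindex/noarchive robots meta tag is an extra touch not mentioned above, asking Google to drop the page from its index and cache as well:

        <html>
        <head>
          <title>Page removed</title>
          <!-- ask crawlers not to index or keep a cached copy of this placeholder -->
          <meta name="robots" content="noindex, noarchive">
        </head>
        <body></body>
        </html>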

      You can do it either way. When you submit the form, you tell Google whether you've edited or removed the page, so I assume it handles the two cases differently.

      The idea is that by submitting this form you don't have to wait around for weeks, because Google spiders the site sooner.