in reply to Making perlmonks search engine friendly

Totally great technique. Except that if we can't get vroom to do it here, all the real perlmonks.org links will still clutter any search engine result listing. :)

How hard would it be to simply redirect user agents that don't look like spiders/bots to the real site? I notice in my logs that the well-behaved spiders ask for robots.txt first, so on that basis you could allow only those agents that ask for robots.txt first to access these clean, better-for-indexing pages.
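A rough sketch of that gate, in Python for illustration (the actual script would be a Perl CGI; the class and parameter names here are hypothetical, and the TTL is an arbitrary choice): remember which client IPs recently fetched robots.txt, and serve the index-friendly pages only to those clients.

```python
import time

# How long a robots.txt fetch marks a client as a "well-behaved" crawler.
ROBOTS_TTL = 24 * 3600

class CrawlerGate:
    def __init__(self):
        self._seen = {}  # ip -> timestamp of last robots.txt request

    def note_request(self, ip, path, now=None):
        """Record a request. Return True if this client may see the
        static, index-friendly pages; False if it should be redirected
        to the real perlmonks.org."""
        now = time.time() if now is None else now
        if path == "/robots.txt":
            self._seen[ip] = now
            return True
        last = self._seen.get(ip)
        return last is not None and now - last < ROBOTS_TTL
```

A real deployment would also have to cope with crawlers behind rotating IPs, which is part of why this stays a heuristic rather than a guarantee.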

Just out of curiosity: does your CGI script simply use an LWP request to perlmonks.org to create the content on the fly? I assume it must, since doing otherwise would be a constant update job. I also assume that, for efficiency, it caches any page it has processed at least once. Something like this would also be a great start on that CD-ROM version of the site. ;)

Re: Re: Making perlmonks search engine friendly
by blakem (Monsignor) on Aug 20, 2001 at 11:27 UTC
    Yep, it keeps a cache for a few days; if the cache misses, fresh content is fetched from perlmonks.org.
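A minimal sketch of that cache-then-fetch behavior, in Python for illustration (the real script presumably uses LWP from Perl; the fetcher is injected so the sketch stays self-contained, and the exact TTL is an assumption):

```python
import time

CACHE_TTL = 3 * 24 * 3600  # "a few days", as stated above

class PageCache:
    def __init__(self, fetch, ttl=CACHE_TTL):
        self._fetch = fetch   # callable: url -> page text
        self._ttl = ttl
        self._store = {}      # url -> (timestamp, page)

    def get(self, url, now=None):
        now = time.time() if now is None else now
        hit = self._store.get(url)
        if hit is not None and now - hit[0] < self._ttl:
            return hit[1]     # fresh enough: serve the cached copy
        page = self._fetch(url)  # cache miss or stale: fetch fresh content
        self._store[url] = (now, page)
        return page
```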

    I don't really want to play the cat-and-mouse game of User-Agent matching. However, I did remap a few more buttons to fall through to the real site: 'Offer your reply', '\d+ replies', 'comment on', and 'perl monks user search' all send you to perlmonks.
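That remapping could look roughly like the following, in Python for illustration (the original is a Perl script; the simplified anchor-tag shape and the function name are assumptions, with the link-text patterns taken from the list above):

```python
import re

# Links whose text matches one of these patterns fall through to the
# real site rather than the mirrored, index-friendly pages.
FALL_THROUGH = [
    r'offer your reply',
    r'\d+ replies',
    r'comment on',
    r'perl monks user search',
]
_PATTERN = re.compile(
    r'<a href="([^"]*)">(' + '|'.join(FALL_THROUGH) + r')</a>',
    re.IGNORECASE,
)

def remap_links(html):
    """Point matching links at the real perlmonks.org, keeping the path."""
    def repl(m):
        return '<a href="http://perlmonks.org%s">%s</a>' % (m.group(1), m.group(2))
    return _PATTERN.sub(repl, html)
```

For example, `remap_links('<a href="/?node_id=42">Offer your reply</a>')` would send that button to `http://perlmonks.org/?node_id=42` while leaving other links untouched.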

    Even without the big "Go to the real perlmonks" sign at the top, it wouldn't take too many clicks before a real user would wind up on the main site.

    -Blake