in reply to Re^5: Super search use DuckDuckGo link broken
in thread Super search use DuckDuckGo link broken

IIRC, something was added to specify that perlmonks.org or www.perlmonks.org is the official domain, so it's no surprise you get better results from that one.


Re^7: Super search use DuckDuckGo link broken
by LanX (Saint) on May 04, 2025 at 10:38 UTC
      According to the robots.txt specification I found in Google's documentation, it's possible to exclude "orthogonal" pages like &displaytype=print or ;displaytype=edithistory with wildcards.

      Any reason not to add

      • Disallow: /*displaytype=

      to the list? (Untested)
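A minimal sketch of how such a rule would sit in robots.txt (the `User-agent` line and comment are illustrative additions; only the `Disallow` line comes from the suggestion above):

```
User-agent: *
# '*' matches any sequence of characters, so this blocks any URL whose
# path-plus-query-string contains "displaytype=" (print, edithistory, ...)
Disallow: /*displaytype=
```

Per Google's documentation, the wildcard applies to the path and query string together, but wildcard support is not part of the original robots.txt convention and varies by crawler.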

      Bing also suggests adding noindex meta tags to the pages.
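For reference, a minimal form of such a tag in the page's `<head>` (the equivalent can also be sent as an `X-Robots-Tag` HTTP header):

```
<meta name="robots" content="noindex">
```

Note that, unlike a robots.txt Disallow, a noindex directive only takes effect if the crawler is actually allowed to fetch the page and see the tag.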

      On a tangent

      Ideally robots would be presented with a page without nodelets, but I'm not aware of an efficient solution, except checking the user-agent before building the page.
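The user-agent check could be sketched roughly like this (a hypothetical illustration, not PerlMonks code; the `is_bot` name, the pattern list, and `render_nodelets` are all assumptions):

```perl
use strict;
use warnings;

# Hypothetical helper: guess from the User-Agent header whether the
# client is a crawler. Real-world bot lists are much longer than this.
sub is_bot {
    my ($ua) = @_;
    return 0 unless defined $ua;
    return $ua =~ /\b(?:Googlebot|bingbot|DuckDuckBot|crawler|spider)\b/i ? 1 : 0;
}

# In the page-building code one might then branch before rendering:
#   render_nodelets() unless is_bot($ENV{HTTP_USER_AGENT});
```

The obvious drawback is the one noted above: the check runs on every request, and user-agent strings are trivially spoofed, so it only filters well-behaved bots.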

      Cheers Rolf
      (addicted to the Perl Programming Language :)
      see Wikisyntax for the Monastery

        These links are already tagged with rel="nofollow", so Google shouldn't be crawling them, nor should any other bot. Except that a lot of them do, so I'm not sure whether spending effort on divining bot behaviour is time well spent.