in reply to Re^4: Perlmonks site has become far too slow

That sounds like it could be viable. But help me understand: what is meant by "dynamic links"?

One thing to remember: we don't want to completely prevent robots from spidering the site; we want people to be able to use g0ggle (etc.) to find information on PerlMonks. We're just looking at ways to throttle robots so they don't (inadvertently or otherwise) DDoS the site.
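As a side note, the crudest form of throttling can already be expressed in robots.txt via the non-standard Crawl-delay directive; some major crawlers (e.g. Bing, Yandex) honor it, though Google ignores it. A minimal sketch, not PerlMonks' actual robots.txt:

    User-agent: *
    # non-standard, but honored by Bing, Yandex and others:
    # ask each crawler to wait 10 seconds between requests
    Crawl-delay: 10

Of course, this only slows down the polite robots; it does nothing against crawlers that ignore robots.txt altogether.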


Re^6: Perlmonks site has become far too slow
by soonix (Chancellor) on Oct 22, 2025 at 06:42 UTC

    From the text there, it sounds like it applies to more or less any link. However, even when looking at the site without logging in, there are a great number of valid links.
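    If I read it correctly, the trick is that Fossil serves pages with the href attributes left out and has a bit of JavaScript fill them in for (presumably human) visitors, so a robot that doesn't run JavaScript never sees a followable link. A minimal sketch of the idea, not Fossil's actual markup:

        <!-- served without href: nothing for a non-JS robot to follow -->
        <a data-href="/?node=Super+Search">Super Search</a>

        <script>
        // promote data-href to a real href once the page has loaded;
        // Fossil reportedly also waits for signs of a human, e.g. mouse movement
        window.addEventListener("load", function () {
          document.querySelectorAll("a[data-href]").forEach(function (a) {
            a.setAttribute("href", a.dataset.href);
          });
        });
        </script>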

    For PerlMonks, it would probably be the Super Search, RAT (Recently Active Threads, at least beyond a certain nesting depth), and nodelets with dynamic content (e.g. Other Users, the CB) that would be "unlinked". But I think these are already blocked for robots.
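    (For reference, the robots.txt side of such blocking might look roughly like this; the entries are illustrative, as PerlMonks addresses most pages through query parameters and I haven't checked its actual rules:)

        User-agent: *
        # keep crawlers off the expensive, dynamically generated pages
        Disallow: /?node=Super+Search
        Disallow: /?node=Recently+Active+Threads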

    So there's probably not much to gain here :-/ especially since this concerns only "a href" links within the site, not incoming links from elsewhere (or URLs that robots construct themselves).

    "we don't want to completely prevent robots from spidering the site"
    Yes, that's understandable. I don't know how Fossil handles that. OTOH, they seem to be less well known, maybe as a result of exactly that 🙈