in reply to Re: Perlmonks Response Time Bottoming Out
in thread Perlmonks Response Time Bottoming Out

We could block all anonymous access on the servers every other hour as "maintenance downtime" until the problem is fixed.

Nothing against anonymity, but if blocking anonymous access will work, then that should be done immediately.
(Just do it as a permanent arrangement ... if that's what it takes.)

Cheers,
Rob

Re^3: Perlmonks Response Time Bottoming Out
by Corion (Patriarch) on Sep 24, 2025 at 06:01 UTC

    While I'm sure the suggestions here are well-meant, I think they are unlikely to have much impact. Many of the Anonymous requests are already served from static pages, so serving an "access denied" page instead of the static page would change very little.

    I think the next step will be to separate the machine where logged-in users access the site from the machine where the crawlers roam. But that requires some setup and thinking, and I won't do that until I have reserved enough time to implement this and also deal with the fallout.

    One downside of having the site accessible at all is that people have again a venue to complain about the site responsiveness.

      > One downside of having the site accessible at all is that people have again a venue to complain about the site responsiveness.

      If our site becomes inaccessible, one of the upsides will be there won't be any place to complain about it ;-)

      map{substr$_->[0],$_->[1]||0,1}[\*||{},3],[[]],[ref qr-1,-,-1],[{}],[sub{}^*ARGV,3]
      > One downside of having the site accessible at all is that people have again a venue to complain about the site responsiveness.

      I'm not complaining; please take this as a brainstorm of options to pick from.

      > I think they are unlikely to have any impact.

      I'm ignorant about the bots' heuristics. Maybe they act differently on an "access denied" response? ¹

      Some other suggestions you are free to consider:

      I opened the Chrome console and looked at the response of one static page (ID:100001).

      It has a Cache-Control: max-age=0 header, indicating that clients shouldn't treat it as static.³

      So a bot might conclude it's worth polling it again and again at short intervals.
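      As a rough illustration of that guess (pure speculation about crawler heuristics on my part; the function and header handling are illustrative, not any real bot's logic), a polite crawler might schedule its next visit from the response headers roughly like this:

      ```python
      def next_poll_delay(status, headers, default=60):
          """Illustrative guess: seconds a polite bot might wait before re-fetching."""
          if status == 503:
              # A 503 with Retry-After tells the bot to stay away for a while.
              return int(headers.get("Retry-After", 3600))
          cc = headers.get("Cache-Control", "")
          for part in cc.split(","):
              part = part.strip()
              if part.startswith("max-age="):
                  # max-age=0 invites immediate re-polling; a large value spaces visits out.
                  return int(part.split("=", 1)[1])
          return default

      print(next_poll_delay(200, {"Cache-Control": "max-age=0"}))            # re-poll at once
      print(next_poll_delay(200, {"Cache-Control": "public, max-age=86400"}))  # come back tomorrow
      print(next_poll_delay(503, {"Retry-After": "3600"}))                   # back off for an hour
      ```

      Under that (speculative) model, max-age=0 is the worst possible signal to send a crawler.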

      Various other embedded resources are also loaded with a delay, and sometimes those take even longer to load than the initiating page itself.

      It seems like they are dynamically pulled from a node for each cached page, even though their content is completely static (JS and CSS), and they carry no cache-control settings.²
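      If the front end happens to be Apache (a guess on my part; I don't know the actual stack, and the file pattern is illustrative), something like this could mark the static JS/CSS as cacheable:

      ```apache
      # Hypothetical sketch, assuming Apache with mod_headers enabled.
      <FilesMatch "\.(js|css)$">
          # Let clients and crawlers reuse these for a day instead of re-fetching.
          Header set Cache-Control "public, max-age=86400"
      </FilesMatch>
      ```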

      That said, I'm quite inexperienced with server settings, cache control, and bot behavior, so my observations may be irrelevant, and my timings may be skewed by the fact that I'm on mobile web access.

      Please ignore me if it is so ...

      Cheers Rolf
      (addicted to the Perl Programming Language :)
      see Wikisyntax for the Monastery

      updates

      ¹) Better still, an HTTP 503 Service Unavailable with Retry-After set to 60 minutes.
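      Again assuming Apache with mod_rewrite and mod_headers (a guess; the User-Agent pattern is illustrative only), footnote ¹ might look something like:

      ```apache
      # Hypothetical sketch: answer suspected crawlers with 503 plus Retry-After.
      RewriteEngine On
      RewriteCond %{HTTP_USER_AGENT} (bot|crawler|spider) [NC]
      RewriteRule ^ - [R=503,L]
      # Tell well-behaved bots to come back in an hour.
      # (In this crude sketch the header is set on all responses, not just the 503s.)
      Header always set Retry-After "3600"
      ```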

      ²) When logged in, my browser does cache those embedded files; not sure what's going wrong here.

      ³) Apparently at least the Google bot honors cache settings, so delivering the static/cached files with proper Cache-Control headers might help reduce the load?

Re^3: Perlmonks Response Time Bottoming Out
by LanX (Saint) on Sep 24, 2025 at 10:18 UTC
    > (Just do it as a permanent arrangement ... if that's what it takes.)

    I don't think becoming invisible is a viable option.

    It's already the case that most search engines other than Google no longer index us.

    Cheers Rolf
    (addicted to the Perl Programming Language :)
    see Wikisyntax for the Monastery