At the risk of stating the obvious, Perlmonks response time has bottomed out again. For the last several days, 9 out of 10 page requests return with "perlmonks.org took too long to respond." Does anyone have any ideas how to solve this?

P.S. It took the better part of an hour just to submit this post.

"It's not how hard you work, it's how much you get done."

Replies are listed 'Best First'.
Re: Perlmonks Response Time Bottoming Out
by LanX (Saint) on Sep 23, 2025 at 15:53 UTC
    On your side, you can try deactivating your nodelets in Nodelet Settings; this could cut enough overhead to get a response back before the timeout. (credits to kcott)

    On the server side:

    Apart from the ideas involving reverse proxies to deal with anonymous clients... (which would take longer to implement):

    We could block all anonymous access on the servers every other hour as "maintenance downtime" till the problem is fixed. ¹

    As an extended idea:

    In these windows, individual monks could test private proxies and perhaps share the settings.

    Personally I have left a trove of content here over the years. I kind of need a backup.

    Saying so, I realize I could try polling the Wayback Machine. Hmm 🤔
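
    A minimal sketch of that idea, assuming the Wayback Machine's public availability API (https://archive.org/wayback/available) and LWP::UserAgent; the URL below is just a placeholder for whichever node you want to check:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use LWP::UserAgent;
        use JSON::PP qw(decode_json);
        use URI::Escape qw(uri_escape);

        # Placeholder: in practice you would loop over the node URLs you want backed up.
        my $node = 'https://www.perlmonks.org/';

        my $ua  = LWP::UserAgent->new( timeout => 30 );
        my $res = $ua->get( 'https://archive.org/wayback/available?url=' . uri_escape($node) );
        die 'Request failed: ' . $res->status_line . "\n" unless $res->is_success;

        # The API reports the closest archived snapshot, if any exists.
        my $snap = decode_json( $res->decoded_content )->{archived_snapshots}{closest};
        if ($snap) {
            print "Closest snapshot ($snap->{timestamp}): $snap->{url}\n";
        }
        else {
            print "No snapshot found for $node\n";
        }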

    Cheers Rolf
    (addicted to the Perl Programming Language :)
    see Wikisyntax for the Monastery

    Update
    ¹) Just checked: it's possible to have rewrite rules in Apache based on the hour and on cookie content. I didn't try to combine both, though, but I bet that's possible too.
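
    For illustration, a rough mod_rewrite sketch of that combination (untested; the cookie name is only a placeholder for whatever the real login cookie is called): during odd hours, requests without a login cookie get a 503 and a static maintenance page.

        RewriteEngine On
        # %{TIME_HOUR} is two digits, so an odd last digit means an odd hour.
        RewriteCond %{TIME_HOUR} [13579]$
        # No login cookie present ("userpass" is a placeholder name).
        RewriteCond %{HTTP_COOKIE} !userpass
        # Answer with 503 Service Unavailable instead of serving the page.
        RewriteRule .* - [R=503,L]
        ErrorDocument 503 /maintenance.html
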
      > We could block all anonymous access on the servers every other hour as "maintenance downtime" till the problem is fixed.

      Nothing against anonymity, but if blocking anonymous access will work, then that should be done immediately.
      (Just do it as a permanent arrangement ... if that's what it takes.)

      Cheers,
      Rob

        While I'm certain that the suggestions here are well-meant, I think they are unlikely to have much impact. Many of the Anonymous requests are already served from static pages, and serving an "access denied" page instead of the static page would change very little.

        I think the next step will be to separate the machine where logged-in users access the site from the machine where the crawlers roam. But that requires some setup and thinking, and I won't do that until I have reserved enough time to implement this and also deal with the fallout.
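
        One way such a split could look, purely as a hypothetical sketch (backend hosts and the cookie name are placeholders, not how perlmonks.org is actually configured): a front proxy that sends requests carrying a login cookie to an interactive backend and routes everything else to a separate machine for crawlers and anonymous readers.

            # Requires mod_rewrite and mod_proxy on the front machine.
            RewriteEngine On
            # Requests with a login cookie go to the interactive backend (placeholder host) ...
            RewriteCond %{HTTP_COOKIE} userpass
            RewriteRule ^/(.*)$ http://interactive-backend:8080/$1 [P,L]
            # ... everything else (crawlers, anonymous readers) goes to a separate backend.
            RewriteRule ^/(.*)$ http://anonymous-backend:8080/$1 [P,L]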

        One downside of having the site accessible at all is that people again have a venue to complain about its responsiveness.

        > (Just do it as a permanent arrangement ... if that's what it takes.)

        I don't think becoming invisible is a viable option.

        It's already the case that most search engines other than Google don't index us anymore.

        Cheers Rolf
        (addicted to the Perl Programming Language :)
        see Wikisyntax for the Monastery