Several times today (11:24, 18:11, 20:43 Europe/Prague) I got "Unable to connect" from Firefox when trying to connect to RAT. After several minutes, the page was back up. Is it a new problem, or one of the old problems reappearing?

map{substr$_->[0],$_->[1]||0,1}[\*||{},3],[[]],[ref qr-1,-,-1],[{}],[sub{}^*ARGV,3]

Re: Unable to connect
by hippo (Archbishop) on Mar 22, 2025 at 16:10 UTC

    It's just done the same again for about 10 minutes (it came back at 16:01 UTC). It is an immediate rejection of the connection, as if the web server were not running. I've not seen that before the past couple of days, so it looks like a new problem to me.

    This applies to any page, of course, not just RAT.


    🦛

      I'm going to go out on a limb and suggest, based on recent reports from elsewhere, that this could be caused by the ever-increasing swarms of content scrapers feeding AI slop factories while actively ignoring and working around any attempts to slow them down.

        I hate to say it, but if this is the case then human verification is needed WHEN abnormal page-visiting patterns (APVP) are detected. Can we check whether there is any APVP in the logs around the short time intervals mentioned by choroba and others?

        I would consider the following a normal visiting pattern: open 5–10 posts from Newest Nodes (I send them to different tabs) in a short burst (e.g. when first landing on Newest Nodes or RAT), then inactivity (voting/commenting does not count) while reading or doing other things. I don't think even someone who has not logged in for a year would open hundreds of posts in one short burst to read them all in a ... few days. Perhaps we could be allowed to read only 10 posts/day from the distant past. Of course this entails a cookie for every visitor, not only for those logged in, or keeping track of what each IP (not user) does and how often. Thinking out loud.
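        The per-IP burst detection described above could be sketched roughly like this. Everything here is an assumption for illustration: the 60-second window, the threshold of 10 requests, and the in-memory hash (a real site would parse its logs or use shared storage), so treat it as thinking out loud in code, not as anything the Monastery actually runs.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my %hits;    # IP address => array ref of request timestamps

# Returns true when an IP exceeds the (assumed) burst threshold
# within the (assumed) sliding window.
sub is_abnormal {
    my ( $ip, $now ) = @_;
    my $window    = 60;    # seconds (assumption)
    my $threshold = 10;    # max requests per window (assumption)

    my $list = $hits{$ip} //= [];

    # Drop timestamps that have fallen out of the window.
    @$list = grep { $now - $_ <= $window } @$list;

    push @$list, $now;
    return scalar(@$list) > $threshold;
}

# A scraper hammering one IP with 30 requests in 30 seconds
# trips the threshold; a reader opening a few tabs would not.
my $flagged = 0;
for my $t ( 1 .. 30 ) {
    $flagged = 1 if is_abnormal( '203.0.113.7', $t );
}
print $flagged ? "APVP detected\n" : "looks human\n";
```

        A handful of tabs opened in one burst stays under the threshold, so the hypothetical "send 5–10 posts to tabs, then go quiet" pattern would pass untouched.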

        Unfortunately, I had to tighten the screws on my private server as well. Most of those scrapers are really, really dumb, too. When encountering a public repository (both Git and Mercurial), instead of just pulling the repo (a rather efficient operation), they follow the web pages and generate every page in every possible way. I'm still working on some smarter rules, but so far I have managed to reduce traffic to my server by (very roughly) 90% without affecting most legitimate users.

        There are still a few things I want to implement to detect bot activity even better and to have the ability to automatically block specific subnets when bot activity is detected from those IPs. But that's all very specific to my private server and unfortunately won't be applicable to the Monastery.
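        A hedged sketch of that subnet-level idea: once several distinct IPs inside the same /24 get flagged as bots, mark the whole subnet for blocking. The /24 granularity and the threshold of 3 offenders are my assumptions here, not whatever the actual server rules use.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my %offenders;    # "a.b.c" prefix => { offending ip => 1, ... }

# Reduce a dotted-quad IPv4 address to its /24 prefix.
sub subnet_of {
    my ($ip) = @_;
    my ($prefix) = $ip =~ /^(\d+\.\d+\.\d+)\.\d+$/
        or die "not an IPv4 address: $ip";
    return $prefix;
}

# Record one IP that some other check has already flagged as a bot.
sub flag_bot {
    my ($ip) = @_;
    $offenders{ subnet_of($ip) }{$ip} = 1;
}

# Subnets with enough distinct offenders to justify a block.
sub subnets_to_block {
    my $min_offenders = 3;    # assumed threshold
    return grep { keys %{ $offenders{$_} } >= $min_offenders }
        sort keys %offenders;
}

# Three distinct offenders in 203.0.113.0/24 get the subnet
# blocked; a single offender elsewhere does not.
flag_bot($_) for qw(203.0.113.5 203.0.113.77 203.0.113.200 198.51.100.9);
print "blocking $_.0/24\n" for subnets_to_block();
```

        The prefixes this emits could then be fed to whatever firewall the server uses; that last step is deliberately left out, since it is entirely site-specific.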

        PerlMonks XP is useless? Not anymore: XPD - Do more with your PerlMonks XP
        Also check out my sister's artwork and my weekly webcomics
Re: Unable to connect
by Anonymous Monk on Apr 12, 2025 at 10:54 UTC
    Ludicrously slow again right now. This has been happening for months now and is seriously hurting the usability of the site as a resource for Perl.
      I wasn't able to connect today at 16:36 CEST.

      map{substr$_->[0],$_->[1]||0,1}[\*||{},3],[[]],[ref qr-1,-,-1],[{}],[sub{}^*ARGV,3]