in reply to Re: Taking advantage of E2 improvements
in thread Taking advantage of E2 improvements

I think a modern, JS-based site is exactly what the perl programming community needs.

I'm not convinced. Yes, JS has some advantages, probably mainly for the chatterbox, and maybe for an improved posting editor. All other functions on the perlmonks website work just fine without JS.

The main problem with perlmonks is currently the response time, or the lack of any response at all. We all know that problem, and I think it is driving away the last few remaining users. (When was the last time you saw a posting less than 12 months old with a three-digit reputation? The current best node of the year, Re: Perl Best Practices -- 20 years later, has a rep of 49.) We need something effective to prevent the almost permanent overload, and we need it yesterday. Rewriting the code takes way too long, and by the time it is finished, there won't be any users left to enjoy perlmonks.

Perlmonks has already been made a little more unfriendly (no more anonymous access to "slow", expensive pages). Let's make it way more unfriendly: require a valid login for everything. Don't bother using the Everything Engine for that. Let the webserver decide (e.g. by checking whether the login cookie exists), and have the webserver redirect anonymous access to a static login page. Create a very small, fast, new program (using FastCGI or anything else faster than CGI) that handles the form submission from the static login page. It only needs to verify the login against the database; no dynamic loading of Perl code is needed. That should make all bots hit just the static login page, and perhaps the login program. Maybe add a JavaScript captcha to the login page to make bot access even harder.
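A minimal sketch of that gatekeeping in nginx configuration (all names, paths, and the "pmsession" cookie name are illustrative assumptions, not the real perlmonks setup; the same could be done with Apache's mod_rewrite):

```nginx
server {
    listen 80;
    server_name perlmonks.example;

    # Static login page, served with no Perl involved at all.
    location = /login.html {
        root /var/www/static;
    }

    # Tiny FastCGI program that only verifies credentials,
    # sets the login cookie, and redirects.
    location = /do-login {
        include fastcgi_params;
        fastcgi_pass unix:/run/login.sock;
    }

    # Everything else requires the (hypothetical) "pmsession" cookie.
    # Anonymous visitors never reach the Everything Engine.
    location / {
        if ($cookie_pmsession = "") {
            return 302 /login.html;
        }
        # Logged-in traffic goes on to the existing engine.
        proxy_pass http://127.0.0.1:8080;
    }
}
```

The point of the sketch is that the expensive code path is only reachable with a cookie, and the cookie check costs the webserver almost nothing.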

<Update:>

Of course, we also need a static page and a small program for creating a new account, or there won't be any new monks. Again, it should not require much code: just put the bare minimum of data into the database, set the cookie, and redirect to the Monastery Gates. And again, make it hard for bots: require a captcha, and perhaps other countermeasures.

</Update:>

If that works, maybe add a whitelist to the webserver for well-behaved bots that, based on their IP addresses, allows them access to the Everything Engine even without the login cookie (still anonymously). That way, the well-behaved bots of the relevant search engines can still add perlmonks content to their indexes.
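In the same hypothetical nginx setup, such a whitelist could be a geo block (the addresses shown are documentation placeholders, not real crawler ranges):

```nginx
# Map trusted crawler addresses to a flag; everything else defaults to 0.
geo $goodbot {
    default        0;
    192.0.2.0/24   1;   # placeholder range for a well-behaved crawler
    198.51.100.17  1;   # another placeholder address
}

server {
    # ...
    location / {
        set $allow 0;
        # Whitelisted bots may pass without the cookie (still anonymous).
        if ($goodbot) {
            set $allow 1;
        }
        # Everyone else still needs the (hypothetical) login cookie.
        if ($cookie_pmsession != "") {
            set $allow 1;
        }
        if ($allow = 0) {
            return 302 /login.html;
        }
        proxy_pass http://127.0.0.1:8080;
    }
}
```

The `set`/`if` dance is needed because nginx's `if` cannot combine conditions; the effect is "cookie present OR whitelisted IP".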

And when that is stable, we can think about improving and/or rewriting the Monastery's program code.

Back to the JS idea. I have to work with some "modern" JavaScript-based web applications. I refuse to call them web pages, because they really are fat clients that load the entire application code into the browser (either openly, or hidden in a "native" app that just bundles a browser engine and a loader for the web page). One is "Early", a time tracker. It wastes a lot of screen space and lacks the easiest way to track time: a simple table. Much worse are Jira and Confluence. Basically the same approach, but with an extra game: every pixel on any page is a little mine. Almost like Minesweeper, but you can't flag anything. Click anywhere on the page and the mine blows up, changing data that you did not intend to change. The same is true for almost any key, because any key may be a keyboard shortcut for shooting yourself in the foot. And very recently, I encountered "ClickUp". Again, it loads tons of JavaScript that is never used, simply because many functions of ClickUp just do not work in my Firefox.

That's not how Perlmonks should work and behave.

Alexander

--
Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)

Replies are listed 'Best First'.
Re^3: Taking advantage of E2 improvements
by JayBonci (Curate) on Nov 27, 2025 at 01:07 UTC
    The Everything Engine is a dog to scale. I've killed off eval(), and am normalizing a lot of the functionality into consistent APIs. Once we have a skinnier feature set, it'll be easier to add engine improvements, like having the nodecache be pre-seeded from a static JSON file, or a memcached/redis caching tier, or porting the site over to FastCGI to be done with mod_perl. It's a lot of cleanout, but it's necessary for the future of the site. The theory is that if the two code bases are close enough, you guys can steal those pre-hardened approaches and benefit.


        --jaybonci