Hello,

Looking at the web server PerlMonks runs on, it appears to be out of date: Apache/1.3.29, while the newest 1.3 release is 1.3.42, which fixes memory corruption and other security issues (see the Apache 1.3 Change Log). Upgrading it might help.

Another thing I've noticed that, if fixed, would improve speed and lower the load significantly: PerlMonks currently doesn't gzip its content. Enabling mod_gzip on Apache and compressing text/html, text/javascript, and other text types would make pages load faster, which in turn would lower the server load and save some traffic.
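For the record, the amount of configuration behind "enabling gzip" is small. The sketch below uses mod_deflate (the Apache 2.x successor to mod_gzip); the module path and the exact list of MIME types are illustrative and will vary by installation:

```apache
# Load the compression module (the path depends on how Apache was built;
# on Debian-style layouts you would run "a2enmod deflate" instead).
LoadModule deflate_module modules/mod_deflate.so

# Compress only textual responses; images and archives are already
# compressed and gain nothing from a second pass.
AddOutputFilterByType DEFLATE text/html text/plain text/css text/javascript application/javascript
```

Turning it off again is just as small a change: remove (or comment out) these lines and restart Apache.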

I hope you'll consider my suggestion; enabling gzip shouldn't take more than about 10 minutes, though upgrading Apache will have to be scheduled for a later time.

Update: I'm getting a lot of -- votes for this ... Anyway, implement it, and if it turns out to be a bad idea or makes things worse, I'll pay a $50 USD administration fee for the time the system admin wastes implementing and then removing it.

Replies are listed 'Best First'.
Re: Enhancing Speed
by Corion (Patriarch) on Aug 18, 2010 at 15:36 UTC

    There is an instance of Perlmonks that runs under mod_perl 2 already, at http://qs1969.pair.com. It still has some quirks, because from time to time DBI stops cooperating when it gets timeouts, but other than that it is fairly stable.

    However, most of Perlmonks' slowness is CPU- and database-bound. How do you think gzipping the content will reduce the CPU time or database time Perlmonks uses?

      When you gzip the content, pages take less time to download, which means Apache can serve more requests in less time.

      I keep hearing that gzipping content stresses the CPU, but in my experience it's a myth (I use mod_deflate with Apache 2.2).

      I don't have the statistics of your site, nor the server specifications ... but I've managed to lower the load of big websites (running PHP, not Perl) by gzipping content.

      It will take 10 minutes of your time ... and after 2 days, if you think it hasn't made things better, you can turn it off (unload the module), restart Apache, and that's it; you're not modifying any part of your code.

      You don't have to recompile Apache for it to work.

        I don't think that will work. The machine already spends 100% of its CPU time creating pages. How will making it spend additional time compressing the content make the site faster?

Re: Enhancing Speed
by ikegami (Patriarch) on Aug 18, 2010 at 15:48 UTC

    Another thing I've noticed that, if fixed, would improve speed and lower the load significantly: PerlMonks currently doesn't gzip its content.

    Doing more work increases the CPU load, and I suspect any speed difference will be negligible. The replies are rather short, so not much time is being spent transmitting the reply. The time is spent generating the reply (many database trips), so increasing the CPU load could actually have a negative impact on speed.

Re: Enhancing Speed
by marto (Cardinal) on Aug 18, 2010 at 15:39 UTC

    "I hope you'll consider my suggestion; enabling gzip shouldn't take more than about 10 minutes, though upgrading Apache will have to be scheduled for a later time."

    Gzipping content requires more CPU time, and IIRC part of the site's performance problems relate to CPU usage. Sure, 10 minutes may not seem like a long time to enable something; however, it could make things worse rather than better.

      How much CPU time do you expect it takes to gzip a 70KB page?

      I tried gzipping a 200KB file on my Core2Duo laptop; it took less than a second. Besides, if you spend 1 second delivering a 10KB page instead of a 70KB one, the page loads faster for the client and the Apache child is released to serve someone else, so I'm sure you'd save CPU time overall.
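      That back-of-the-envelope claim is easy to check. A quick sketch in Python (the ~70 KB page here is synthetic, highly repetitive HTML, so it compresses far better than a real page would; the timing is the interesting part):

```python
import gzip
import time

# Hypothetical ~70 KB page body built from repetitive HTML-like rows.
# Real pages are less redundant, so real ratios will be more modest.
page = b"<tr><td class='content'>Lorem ipsum dolor sit amet</td></tr>\n" * 1200

start = time.perf_counter()
compressed = gzip.compress(page, compresslevel=6)  # 6 is zlib's default level
elapsed = time.perf_counter() - start

print(f"original:   {len(page)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"time:       {elapsed * 1000:.3f} ms")
```

      On a modern machine this typically finishes in well under a second; whether that extra CPU work is affordable on a server that is already CPU-bound is the point the other replies dispute.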

        "How much CPU time do you expect it takes to gzip a 70KB page?"

        Wouldn't that depend on various factors?

        My point was that your suggested ten minute change could end up doing more harm than good, as others have also suggested.

        The metric in your example doesn't take any of the relevant factors into consideration: the server's CPU speed, the server's memory, the applications (PerlMonks' customised version of everything, plus the DB queries), or the load. As you say, "I don't have the statistics of your site, nor the server specifications".

        Compressing data on your laptop (with a relatively modern CPU) is not, IMHO, a comparable benchmark for running compression on old servers. It is neither a sensible way to measure the impact nor a justification for someone taking ten minutes out of their day to implement your untested change, which may take longer than that to recover from should the resulting performance be unfavourable.

        Failure to consider your environment can lead to trouble.