
Can your site handle this?

Yes it can. Quite nicely in fact, running here locally on my laptop.

My one publicly visible (private) webserver... not so much. But that's not actually a software problem: my webserver, running in VirtualBox on ancient hardware, has just 150 MB or so of free RAM without any connections and starts swapping; everything goes downhill from there. Replacement hardware is on its way, though.

Not that it matters, of course; my wikicables database gets something like one visitor per day.

All my important services run on real hardware with enough power to go to 1024+ threads without too much trouble; it just takes a second or so longer to display a dynamic page. Although I must admit it gets quite noisy when the whole kit and caboodle is spinning up its fans and grinding its database hard disks - until the automatic network defense kicks in and blacklists the attacker.

It works without all that fancy Plack and aXML stuff, just plain old HTTP::Server::Simple::CGI::PreFork with a bit of Maplat magic thrown in. Of course, using a good database (PostgreSQL), a sane database layout and memcached also helps a lot ;-)
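
For anyone who hasn't used it: the whole thing is really just the documented subclass-and-handle_request pattern. Here is a minimal sketch of that shape (not my actual Maplat code - the handler body, port and cache key are made up for illustration, and the preforking options you can pass to run() are described in the module's POD):

    #!/usr/bin/perl
    use strict;
    use warnings;

    package MyServer;
    use base 'HTTP::Server::Simple::CGI::PreFork';
    use Cache::Memcached;

    # memcached client; the address is illustrative
    my $memd = Cache::Memcached->new({ servers => ['127.0.0.1:11211'] });

    sub handle_request {
        my ($self, $cgi) = @_;

        # try the cache first, fall back to the "expensive" work
        my $body = $memd->get('frontpage');
        if (!defined $body) {
            $body = "Hello from a preforked server\n";   # stand-in for a DB query
            $memd->set('frontpage', $body, 60);          # cache for 60 seconds
        }

        print "HTTP/1.0 200 OK\r\n";
        print $cgi->header('text/plain'), $body;
    }

    package main;

    # port is arbitrary; see the POD for the options that control preforking
    MyServer->new(8080)->run();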

Don't use '#ff0000':
use Acme::AutoColor; my $redcolor = RED();
All colors subject to change without notice.

Re^2: Can your site handle this?
by Anonymous Monk on Nov 06, 2011 at 00:51 UTC

    Oh come on, stress testing against localhost is entirely unrealistic. For a start you've removed all network I/O limitations, and secondly, unless you're renting a beefy server you're likely to have far more available RAM on a home system, especially these days.

    You mention swapping, and you're quite right: the moment your server starts grinding in virtual memory it's game over.

    Any large-scale site these days runs on servers with dozens or hundreds of processors and many gigabytes or even terabytes of RAM.

    When you're running out of RAM it doesn't matter how clever or efficient your algorithm is; it's still going to grind to a horribly slow halt when memory is maxed out.

    That's why I argue that memory usage is far more important than processing power these days when building scalable solutions. The same was not true just a few years ago, when most of the solutions we have available (the ones people religiously worship as being, beyond question, the only solutions worth using) were written.

    I read somewhere, and I'm sorry I can't remember where to give a citation, that HTTP::Engine uses 15 MB of RAM per child process to do what it does, and that probably made perfect sense in 2006... It was designed for machines where memory was less of a constraint than processor speed, and that's why new solutions are needed when the hardware specification changes.

    That in itself is proof of the validity of reinventing wheels when needed, because the road itself has changed.

    Perhaps in the future CPU speed increases will falter whilst memory chip density accelerates, and then the situation shifts again. But in the meantime, unless you have a big-iron server budget, your server is far more likely to run at peak efficiency on software which uses more processor power but consumes only 2 MB of RAM per child process rather than 15 MB, and leaves the CPU idling for most of its cycles. A back-of-the-envelope sketch of that trade-off follows below.
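
    To put rough numbers on it (the 512 MB budget below is just a figure picked for illustration, not anything measured):

        #!/usr/bin/perl
        use strict;
        use warnings;

        # back-of-the-envelope: how many children fit in a given RAM budget
        my $budget_mb = 512;    # hypothetical RAM left over for the web server

        for my $per_child_mb (15, 2) {
            my $children = int($budget_mb / $per_child_mb);
            printf "At %2d MB per child: about %d children in %d MB\n",
                $per_child_mb, $children, $budget_mb;
        }

    That's roughly 34 children versus 256 in the same box, which is the whole argument in one line.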

    This is also a generic argument for the comeback of dynamic languages in general, since they are always more processor-intensive than their fully pre-compiled counterparts, a fact that simply doesn't matter here in late 2011.

      Oh come on, stress testing against localhost is entirely unrealistic. For a start you've removed all network I/O limitations,

      You are quite right, for the most part. First of all, my DSL modem would go up in flames before even a cheap rent-a-server service would notice any relevant increase in traffic. So for real-life testing you need a decent internet connection for a start. Given that, yes, testing against real traffic makes a lot more sense.

      But testing against localhost is also a good idea. You might notice race conditions and things like that a lot more easily. At least, I did.
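
      For what it's worth, even something as crude as the sketch below will shake out a surprising number of concurrency bugs before any "real" load test. The URL, client count and request count are of course made up:

          #!/usr/bin/perl
          use strict;
          use warnings;
          use HTTP::Tiny;

          # hammer a local server with a handful of forked clients
          my $url      = 'http://localhost:8080/';
          my $clients  = 20;
          my $requests = 100;

          my @pids;
          for (1 .. $clients) {
              my $pid = fork();
              die "fork failed: $!" unless defined $pid;
              if ($pid == 0) {
                  my $http   = HTTP::Tiny->new;
                  my $failed = 0;
                  for (1 .. $requests) {
                      my $res = $http->get($url);
                      $failed++ unless $res->{success};
                  }
                  exit($failed ? 1 : 0);   # non-zero exit if anything went wrong
              }
              push @pids, $pid;
          }

          waitpid($_, 0) for @pids;
          print "done\n";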

      and secondly, unless you're renting a beefy server you're likely to have far more available RAM on a home system, especially these days.

      When doing production-critical services, I usually put my own servers in a colocation. While it's certainly more work, it usually pays off (for me) in the long run, especially when upgrade time comes around. But since I do (mostly) in-house stuff, your situation is probably completely different from mine...

      Don't use '#ff0000':
      use Acme::AutoColor; my $redcolor = RED();
      All colors subject to change without notice.
        your situation is probably completely different from mine...

        Yeah, sounds like it... I'm unemployed, and I can only afford a basic server on a budget of about $20 a month. I am therefore forced to make the absolute most out of very limited hardware, at least until one of my sites becomes financially productive and I am able to scale my hosting plan up.

        The practical upshot of being so highly restricted is that code which runs fast on such a tiny box is going to kick arse when it gets run on a decent-sized box.

        I once had access to some IBM big iron; I have no idea how much RAM it had or how much processing power, as the service was provided on the basis of price per megabyte of storage. That was a few years ago now, and at the time I couldn't see any problem with the efficiency of my system.

        It wasn't until quite recently, when contemplating what would be needed to run a massively multiplayer online game, that I realised the software I had was not going to be anywhere near fast enough for the task given the size of server I can afford. So the quest to speed it up began, and that's why aXML got optimised and married to Plack. The result is that it's now several orders of magnitude faster, and even a poxy little $20-a-month server is a sufficient basis for several hundred concurrent users... and my path is clear to develop the game I have in mind and have wanted to get stuck into writing for a long time now.
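
        For anyone who hasn't looked at Plack yet: "married to Plack" really just means the engine speaks PSGI, so it can be dropped onto whatever server suits the box. This is not aXML's actual code, just the bare interface:

            # app.psgi - a minimal PSGI application
            use strict;
            use warnings;

            my $app = sub {
                my $env = shift;   # PSGI environment hash
                return [
                    200,
                    [ 'Content-Type' => 'text/plain' ],
                    [ "Hello from PSGI\n" ],
                ];
            };

            $app;   # a .psgi file returns the application coderef

        You run that with plackup app.psgi while developing, or hand it to a preforking backend such as Starman (plackup -s Starman --workers 10 app.psgi) in production; the worker count of 10 is just an example.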

        You know what, aXML is a stupid, unimaginative name... I really need to find a better one. I have been toying with "Diamond", or maybe "Sapphire". Either would be better, and given that we already have "Perl" and "Ruby", the precious-gem moniker seems to fit quite nicely with current trends.