in reply to Re: Scalable application with perl
in thread Scalable application with perl

This may turn out to be a fairly simple problem after all. 12,000 users, but what are they doing during those 90 minutes? One GET and one POST each? Dozens? A hundred? Can users change answers or go back a page? Issues like these can quickly turn a simple problem into a terribly complicated one.

Vanilla nginx can serve pre-built or cached GETs of form pages at 12K requests per second just fine if you have the bandwidth. The incoming POSTs have to go through the application, though, and you won't know how fast that is until you build, or at least prototype, it. uwsgi has deep and powerful controls for how many worker processes to run and can even adjust the count dynamically. nginx also has an experimental embedded Perl module now, which could be amazing (I haven't tried it yet). Anyway, if you have the RAM and a halfway decent CPU, I imagine you'll be able to handle the load you're describing unless a lot of the users are banging on submit the whole time.
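
By way of illustration (none of this is from the original thread, and the handler logic is just a placeholder), a minimal PSGI prototype that uwsgi could run with its Perl plugin might look like:

    # app.psgi - minimal PSGI sketch; handler logic is illustrative only
    use strict;
    use warnings;

    my $app = sub {
        my $env = shift;

        # Only the POSTs need to reach the application;
        # nginx serves the cached GETs of the form pages itself.
        if ($env->{REQUEST_METHOD} eq 'POST') {
            my $len = $env->{CONTENT_LENGTH} || 0;
            $env->{'psgi.input'}->read(my $body, $len);

            # ... validate and store $body here ...

            return [ 200, [ 'Content-Type' => 'text/plain' ], [ "OK\n" ] ];
        }

        return [ 405, [ 'Content-Type' => 'text/plain' ], [ "POST only\n" ] ];
    };

    $app;

Then something like `uwsgi --plugins psgi --http :8080 --psgi app.psgi --processes 16 --cheaper 2` would let uwsgi scale between 2 and 16 workers as load changes; treat the flags and numbers as a starting point to tune against your prototype, not a recommendation.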

Tips–

Re^3: Scalable application with perl
by tangent (Parson) on Apr 26, 2016 at 19:34 UTC
    Extra clock savings: don't touch the DB at all during the exams. Dump the validated submissions as single field delimited strings to a flat file or NoSQL or similar.
    Excellent tip. You could use a key/value store like DB_File to save each submission. If you can construct the form "action" attribute to give it a unique path (perhaps using JavaScript on the client side), then you would not even have to parse the query arguments:
    <form action="myapp/user1/question1">
    <form action="myapp/user1/question2">
    <form action="myapp/user2/question1">
    Then use $ENV{'PATH_INFO'} as the key ('/user1/question1'), and read from STDIN directly to get the raw value to store.
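    A minimal CGI sketch along those lines (the file names answers.db and answers.lock are arbitrary, and the locking strategy is my addition, not from the post):

        #!/usr/bin/perl
        # Store each raw submission under its PATH_INFO key.
        use strict;
        use warnings;
        use DB_File;
        use Fcntl qw(:flock O_CREAT O_RDWR);

        # Serialize writers: DB_File is not safe for concurrent writes,
        # so take an exclusive lock on a separate lockfile first.
        open my $lock, '>', 'answers.lock' or die "lockfile: $!";
        flock $lock, LOCK_EX or die "flock: $!";

        tie my %answers, 'DB_File', 'answers.db', O_CREAT|O_RDWR, 0644, $DB_HASH
            or die "tie: $!";

        my $key = $ENV{PATH_INFO};                  # e.g. '/user1/question1'
        my $len = $ENV{CONTENT_LENGTH} || 0;
        read STDIN, my $raw, $len;                  # raw POST body, unparsed
        $answers{$key} = $raw;

        untie %answers;
        close $lock;                                # releases the lock

        print "Content-Type: text/plain\r\n\r\nOK\n";

    After the exam you can walk the tied hash once and load everything into the real database at leisure, which keeps the DB untouched while the exam is running.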