advait has asked for the wisdom of the Perl Monks concerning the following question:

Hi Monks,
I have a website that does heavy-duty statistical computing on the server. When the number of job requests coming in from the web is high, the server becomes very slow. Is there any way I can limit the number of jobs submitted to the server from my website? For example, allow just 5 jobs to run at a time while the other requests wait in a queue. I have no idea about networking or queuing.
Please help.
Thank you.

Replies are listed 'Best First'.
Re: Limiting the jobs from web on server
by jhourcle (Prior) on Mar 05, 2008 at 19:14 UTC

    It's not a Perl-specific answer, but when I've run similar jobs, I've used CGIwrap so that I can set limits on a per-user basis (and then just run the jobs as a specific user).

    See the option:

    --with-rlimit-nproc=COUNT
    limit number of processes with setrlimit

    In this specific case, it won't queue things up, but it will reject requests if too many come in at once (depending on your users, this might not be acceptable). I like it because I can get it to kill processes that run too long, and keep them from sucking down all of the server's memory.
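
    If you want the same kind of per-process cap without CGIwrap, here's a minimal sketch of the idea from inside a Perl wrapper, assuming the BSD::Resource module from CPAN is installed; the limit values and the job-script path are made up for illustration:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use BSD::Resource qw(setrlimit RLIMIT_NPROC RLIMIT_CPU);

    # Cap how many processes this user may have at once, and how much
    # CPU time each may burn, before starting the real work.
    setrlimit(RLIMIT_NPROC, 5, 5)    or die "can't set process limit: $!";
    setrlimit(RLIMIT_CPU, 300, 300)  or die "can't set CPU limit: $!";

    # Run the actual statistics job; it inherits the limits, so the
    # kernel refuses extra forks and signals it if it runs too long.
    exec '/usr/local/bin/run_stats.pl'   # hypothetical job script
        or die "exec failed: $!";

    Like the CGIwrap option, this rejects work rather than queueing it: a request that arrives past the limit simply fails.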

Re: Limiting the jobs from web on server
by sundialsvc4 (Abbot) on Mar 05, 2008 at 23:01 UTC

    What you really need to do here is to ... get to know networking and queueing. :-)

    In other words, your website would be used to create and submit jobs to a batch-processing system on the same or on a different server, and to monitor the status of the jobs as they are completed. There are many batch and cluster management systems out there, and your website would serve as an interface to one.

    Furthermore, you will discover that nearly all of these batch-management systems already have a web interface, so the entire need for your existing website may prove to be ... nonexistent.
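
    If you do roll your own instead, the shape of it is small: the CGI script just drops a job file into a spool directory and returns, and a separate worker process drains that directory a few jobs at a time. A rough sketch of the worker, assuming the CPAN module Parallel::ForkManager and a made-up spool layout and job command:

    #!/usr/bin/perl
    # worker.pl -- run queued jobs, at most 5 at a time
    use strict;
    use warnings;
    use Parallel::ForkManager;

    my $spool = '/var/spool/statjobs';          # hypothetical spool dir
    my $pm    = Parallel::ForkManager->new(5);  # hard cap on concurrency

    while (1) {
        for my $job (glob "$spool/*.job") {
            $pm->start and next;    # parent queues up the next job
            # child: claim the file so no other worker grabs it
            my $claimed = "$job.running";
            rename $job, $claimed or $pm->finish(1);
            system '/usr/local/bin/compute_stats', $claimed;  # hypothetical
            unlink $claimed;
            $pm->finish(0);
        }
        $pm->wait_all_children;
        sleep 5;                    # poll for new submissions
    }

    The website then only needs to write job files and read back results, which keeps the heavy work off the web server's request path.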

      unless you're just trying to learn :)