in reply to exec command taking ages
To my way of thinking, a web-page should never “wait for” any long process to complete. It can be used to submit a request, and it can be used to query to see whether the request has completed, and it can be used (if it has completed) to retrieve the results therefrom. But the web-page should never be the actor in this play.
The user submits a request. This request is carefully validated and then handed off to a background job-processing system, which gives the user some kind of token with which to follow up on it. Maybe, when the job is done, an e-mail is sent. The user returns, presents the token, and gets the result.
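The original thread is about Perl and CPAN, but the submit/poll/retrieve flow is language-agnostic. Here is a minimal sketch in Python; the names `submit_job`, `job_status`, and `job_result` are hypothetical, and a real system would persist jobs somewhere durable rather than in a process-local dict:

```python
# Minimal sketch of the token-based flow: submit work, get a token,
# poll for completion, retrieve the result. Illustrative only.
import threading
import uuid

_jobs = {}               # token -> {"done": bool, "result": ...}
_lock = threading.Lock()

def submit_job(work, *args):
    """Validate the request, enqueue it, and hand the user back a token."""
    token = uuid.uuid4().hex
    with _lock:
        _jobs[token] = {"done": False, "result": None}

    def run():
        result = work(*args)          # the long-running task
        with _lock:
            _jobs[token].update(done=True, result=result)
            # a real system might send the completion e-mail here

    threading.Thread(target=run, daemon=True).start()
    return token

def job_status(token):
    """Report 'pending', 'done', or 'unknown' for a token."""
    with _lock:
        job = _jobs.get(token)
    return "unknown" if job is None else ("done" if job["done"] else "pending")

def job_result(token):
    """Return the result once the job has completed."""
    with _lock:
        job = _jobs.get(token)
    if job and job["done"]:
        return job["result"]
    raise LookupError("job not finished, or unknown token")
```

The web-page's only roles are to call `submit_job` once and then `job_status`/`job_result` on later visits; it never blocks on the work itself.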
The request-processing should be completely independent of the web-page mechanism, and thus immune to the number or the frequency of web-page hits. It, too, should validate every request that it receives, and it should refuse to accept too much work. Likewise, it should refuse to attempt to perform too many work requests simultaneously. There are plenty of good transaction-processing frameworks out there, on CPAN and elsewhere. (For that matter, “soup-to-nuts request processing frameworks” are also available off the shelf. The costs of building such a mechanism from scratch are not insignificant, and should be avoided if possible.)
Re^2: exec command taking ages
by Anonymous Monk on Nov 24, 2010 at 15:57 UTC