Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:
Hello fellow Monks,
I have a general implementation/design question that has been troubling me. We are setting up a website that will provide a free resource to researchers in our institution. There are three parts to this:

1. A CGI front end where researchers submit their input parameters.
2. A wrapper script that runs the data-processing.
3. A page that returns the results to the researcher.

The part that is troubling me is #2. The data-processing can be pretty computationally intensive, so I would prefer to "queue" requests so that they get processed serially rather than in parallel. It seems that doing things that way would break the underlying CGI request/response model, since the work would not be finished by the time the response has to be returned.
The best option I see is as follows: the CGI script writes each request's parameters to a text file, and a cron job (okay, this is on WinXP, so I guess it would have to be a scheduled task or service?) regularly checks that file to see if there are new records to be run. If so, it processes them one at a time and dumps the results to a text file.
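For concreteness, here is a minimal sketch of that strategy in Perl. The queue file location (`C:/queue/requests.txt`), the tab-separated record layout, and the wrapper name (`wrapper.pl`) are all assumptions for illustration, not details from the post. The CGI half just appends one record per request under an exclusive lock:

```perl
#!/usr/bin/perl
# enqueue.cgi -- CGI side: append one request per line to the queue file.
# The queue path and the tab-separated layout are assumptions for this sketch.
use strict;
use warnings;
use CGI;
use Fcntl qw(:flock);

my $q     = CGI->new;
my $queue = 'C:/queue/requests.txt';    # assumed location

# Serialize the submitted parameters into a single line.
my $record = join "\t", map { $_ . '=' . $q->param($_) } $q->param;

open my $fh, '>>', $queue or die "Cannot open $queue: $!";
flock $fh, LOCK_EX or die "Cannot lock $queue: $!";    # one writer at a time
print {$fh} "$record\n";
close $fh;

print $q->header('text/plain'),
      "Your request has been queued; results will appear once it is processed.\n";
```

The scheduled half drains the file and runs the jobs one at a time, which gives the serial processing for free:

```perl
#!/usr/bin/perl
# worker.pl -- scheduled side: drain the queue and run jobs one at a time.
use strict;
use warnings;
use Fcntl qw(:flock);

my $queue = 'C:/queue/requests.txt';    # same assumed location as above

# Open read/write so we can both slurp the pending records and truncate
# the file, letting new requests accumulate while the jobs run.
open my $fh, '+<', $queue or exit;      # no queue file means nothing to do
flock $fh, LOCK_EX or die "Cannot lock $queue: $!";
my @jobs = <$fh>;
truncate $fh, 0 or die "Cannot truncate $queue: $!";
close $fh;

for my $job (@jobs) {
    chomp $job;
    # Hand each record to the existing standalone wrapper; "wrapper.pl"
    # is a placeholder name. Jobs run serially by construction.
    system( 'perl', 'wrapper.pl', $job ) == 0
        or warn "Job failed: $job\n";
}
```

Note that in this sketch the scheduled script shells out to the existing standalone wrapper rather than embedding it, so the only new agreement needed is the per-job record format passed on the command line.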
I'm not a huge fan of the above strategy because, amongst other things, it forces me to rewrite the wrapper, which already exists as a standalone, non-CGI program. Are there any other standard solutions to this kind of problem?
Replies are listed 'Best First'.
Re: Queuing Input for Serial Processing
by bean (Monk) on Aug 25, 2003 at 18:29 UTC

Re: Queuing Input for Serial Processing
by LameNerd (Hermit) on Aug 25, 2003 at 17:33 UTC
    by Anonymous Monk on Aug 25, 2003 at 17:40 UTC
    by LameNerd (Hermit) on Aug 25, 2003 at 17:49 UTC