oyse has asked for the wisdom of the Perl Monks concerning the following question:
Hello fellow monks,
I am currently implementing a system where a CGI request can initiate a long-running operation. This operation involves sending data to, or retrieving data from, other systems using web services. I don't know exactly how long the operation will take, but I expect it can take anywhere from 1 to 30 minutes, split over several smaller operations.
A simple way to implement this seems to be to put all the operations in a queue implemented as a database table. A separate task on the web server will be responsible for checking the queue; if there are any operations in the queue, it will perform them, and if the queue is empty it will just wait.
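For concreteness, a minimal sketch of what the enqueue side might look like. The `job_queue` table and its `id`/`status`/`payload` columns, the DSN, and the credentials are placeholders of my own, not part of the actual system:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;
use DBI;

my $q = CGI->new;

# Connect to the queue database (DSN and credentials are placeholders).
my $dbh = DBI->connect('dbi:ODBC:JobQueueDSN', 'user', 'password',
                       { RaiseError => 1, AutoCommit => 1 });

# Record the requested operation in the queue and return immediately,
# instead of doing the long-running work inside the CGI request.
$dbh->do(
    'INSERT INTO job_queue (status, payload) VALUES (?, ?)',
    undef, 'pending', scalar $q->param('operation'),
);

print $q->header('text/plain'), "Request queued; check back later for results.\n";
```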
So far so good. The problem is implementing the task on the web server that checks the queue. I could implement this as a client (the CGI request) and server (the task checking the queue), or as some form of periodic script initiated by a service, but what is the common way to do this? If anyone has experience with this type of thing and knows of any best practices, I would like to hear them before I start.
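A corresponding sketch of the queue-checking task, again using the made-up `job_queue` table; `run_operation` is a stand-in for the real web-service calls. With a single worker this simple SELECT-then-UPDATE claim is sufficient; multiple workers would need an atomic claim (e.g. an UPDATE with a WHERE on the old status):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:ODBC:JobQueueDSN', 'user', 'password',
                       { RaiseError => 1, AutoCommit => 1 });

while (1) {
    # Fetch the oldest pending job, if any.
    my $job = $dbh->selectrow_hashref(
        q{SELECT id, payload FROM job_queue
          WHERE status = 'pending' ORDER BY id},
    );

    if ($job) {
        # Mark the job as claimed before starting the slow work.
        $dbh->do(q{UPDATE job_queue SET status = 'running' WHERE id = ?},
                 undef, $job->{id});
        eval {
            run_operation($job->{payload});   # the actual web-service calls
            $dbh->do(q{UPDATE job_queue SET status = 'done' WHERE id = ?},
                     undef, $job->{id});
            1;
        } or $dbh->do(q{UPDATE job_queue SET status = 'failed' WHERE id = ?},
                      undef, $job->{id});
    }
    else {
        sleep 10;   # queue is empty; poll again shortly
    }
}

sub run_operation {
    my ($payload) = @_;
    # Placeholder for sending/retrieving data via the other systems'
    # web services.
}
```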
Some requirements must be taken into account to some degree. Among other things, the application will run on Windows and IIS, so Unix-only solutions will not help me much.
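One Windows-native way to run the queue checker periodically (a sketch of my own, assuming the worker above is changed to drain the queue once and exit rather than loop forever) is the built-in Task Scheduler:

```
schtasks /Create /SC MINUTE /MO 5 /TN "QueueWorker" /TR "perl C:\scripts\worker.pl"
```

Alternatively, a long-lived loop like the one above could be installed as a Windows service, e.g. via the Win32::Daemon CPAN module.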
Replies are listed 'Best First'.
Re: Handling long running operations started by CGI requests
by fenLisesi (Priest) on Jun 15, 2009 at 16:56 UTC

Re: Handling long running operations started by CGI requests
by afoken (Chancellor) on Jun 15, 2009 at 18:24 UTC

    by oyse (Monk) on Jun 18, 2009 at 06:21 UTC