in reply to CGI scripting and multiple processes

Hrmm, while your approach would certainly work, I would have some concerns about how well it scales under high server load. From the description of your problem, you might do better with a caching solution that handles this type of object expiry and clean-up automatically. The Cache modules provide just what you require: persistent data with a specific expiry time.

An example of usage which creates a Cache::FileCache cache under /tmp/cgi-cache with a default expiry time of 15 minutes (900 seconds) and auto-purge enabled (upon object retrieval) ...

    use Cache::FileCache;

    my $cache = Cache::FileCache->new({
        'cache_root'         => '/tmp/cgi-cache',
        'default_expires_in' => 900,
        'auto_purge_on_get'  => 1,
    });

    $cache->set( $key, $data );
    . . .
    $cache->get( $key );

The usage of the Cache modules is relatively straightforward, with clear and self-explanatory method names, and the provided documentation is excellent. I would strongly recommend having a look at this set of modules, as they provide you with a pre-built means of implementing data persistence with expiry, without having to incorporate forking and clean-up within your script.
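To illustrate what "expiry with purge-on-get" amounts to under the hood, here is a minimal sketch using only core modules (Storable and File::Spec) - note that this is just a toy illustration of the concept, not the actual Cache::FileCache implementation, and the directory name and helper subroutine names are my own invention:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Storable qw(store retrieve);
    use File::Spec;

    # Hypothetical demo directory - the real modules manage cache_root for you
    my $dir = '/tmp/cgi-cache-demo';
    mkdir $dir unless -d $dir;

    # Store data alongside an absolute expiry timestamp
    sub cache_set {
        my ($key, $data, $ttl) = @_;
        my $file = File::Spec->catfile($dir, $key);
        store { expires => time() + $ttl, data => $data }, $file;
    }

    # On retrieval, check the timestamp and purge stale entries
    sub cache_get {
        my ($key) = @_;
        my $file = File::Spec->catfile($dir, $key);
        return undef unless -e $file;
        my $entry = retrieve($file);
        if (time() > $entry->{expires}) {
            unlink $file;       # expired - purge on get
            return undef;
        }
        return $entry->{data};
    }

    cache_set('session', { user => 'anon' }, 900);
    my $d = cache_get('session');
    print defined $d ? "hit: $d->{user}\n" : "miss\n";   # prints "hit: anon"

The point is simply that the expiry check happens at retrieval time, so no separate reaper process or crontab entry is needed - which is exactly the book-keeping the Cache modules take off your hands.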


Replies are listed 'Best First'.
Re: Re: CGI scripting and multiple processes
by rob_au (Abbot) on Jun 15, 2002 at 08:06 UTC
    To follow up on this, I have just come across a new module on CPAN which may also suit this task, File::CacheDir - This module is touted as being able to keep track of and clean up temporary files, quickly and without the use of a crontab. I have not yet had a chance to use this module or explore it in any detail, however.