in reply to unlinking old files

Most likely you already know that scanning all files on every run of your CGI script will be quite slow, and that you will probably run into permission problems: your webserver runs under a different user than you do, and that user may not have the right to delete your files. A cron job would be the cleanest solution in my opinion, but you said you don't want one.
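For the record, should you change your mind about cron: one crontab line with find(1) handles the whole cleanup outside the webserver. This is only a sketch; the path, schedule, and one-day cutoff below are placeholders for your setup.

```shell
# Hypothetical crontab entry: every night at 03:15, remove plain files
# under /usr/home/foo/files/ not modified for more than one day.
# (-exec rm is used instead of GNU find's -delete for portability.)
15 3 * * * find /usr/home/foo/files/ -type f -mtime +1 -exec rm -f {} \;
```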

Perl has some nice file test operators for checking file times, namely -M (days since last modification, measured from script start), -A (days since last access) and -C (days since the last inode change).
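A quick way to see what these operators return (a small sketch; it inspects the running script itself via $0, so the file is guaranteed to exist):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# $0 names the script currently running, so no hardcoded path is needed.
my $file = $0;

# Each operator returns a floating-point age in days,
# relative to the time the script started.
printf "days since modification (-M): %.3f\n", -M $file;
printf "days since last access  (-A): %.3f\n", -A $file;
printf "days since inode change (-C): %.3f\n", -C $file;
```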

To use them at startup in your code, I would do more or less the (untested) following:

use strict;
use File::Find;

my @files;

# First we fill @files with the full path of
# every plain file below the directory
find( sub { push @files, $File::Find::name if -f }, "/usr/home/foo/files/" );

# Now we check each file to see whether it has become too old
foreach (@files) {
    # If our file is older than one day, we will (try to) delete it
    if ( -M $_ > 1 ) {
        unlink $_;    # no error checking here ...
    }
}

# ... The rest of your script goes here ...
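Since the snippet above deliberately skips error checking: unlink returns the number of files it actually removed, and $! holds the reason on failure, so a minimal check could look like this (a sketch; the @stale paths are made-up examples standing in for the too-old entries of @files):

```perl
use strict;
use warnings;

# Hypothetical list of stale files; in the script above these would be
# the entries of @files whose -M age exceeds one day.
my @stale = ( "/usr/home/foo/files/old1.dat", "/usr/home/foo/files/old2.dat" );

foreach my $file (@stale) {
    # unlink returns the number of files removed (0 or 1 here);
    # on failure $! explains why, e.g. "Permission denied".
    unlink $file
        or warn "Could not delete $file: $!\n";
}
```

That way a file the webserver user may not remove produces a warning in the error log instead of vanishing silently from your cleanup.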