in reply to Fast Recall
Perhaps the simplest mechanism would be to write an empty file into a local directory for each failing ID as you read it, using the ID itself as the filename.
Each time an ID reads successfully, you can use `-e` to check whether it has a previous failure recorded; if it has, you can now unlink that file.
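A minimal sketch of that failure-tracking side, assuming the IDs are plain strings that are safe to use as filenames; the directory path and subroutine names here are only illustrative:

```perl
use strict;
use warnings;

my $dir = '/var/tmp/failed-ids';    # hypothetical location, created beforehand

# On a failed read: record the ID by touching an empty file named after it.
sub record_failure {
    my ($id) = @_;
    open my $fh, '>', "$dir/$id" or die "Cannot create $dir/$id: $!";
    close $fh;
}

# On a successful read: if the ID failed before, remove the marker file.
sub clear_failure {
    my ($id) = @_;
    if (-e "$dir/$id") {
        unlink "$dir/$id" or warn "Cannot unlink $dir/$id: $!";
        return 1;    # this ID had a previous failure
    }
    return 0;
}
```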
Later, you can use the timestamps on the files to delete any that are more than 8 hours old. This could even be done by a separate process that scans the directory at regular intervals via cron.
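A sketch of that cleanup pass, suitable for running from cron. It uses `-M` (age in days since last modification) as a stand-in for the creation time, which works here because the marker files are never rewritten after they are created; the path is again only an example:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $dir     = '/var/tmp/failed-ids';
my $max_age = 8 / 24;    # 8 hours, expressed in days for -M

opendir my $dh, $dir or die "Cannot open $dir: $!";
for my $name (readdir $dh) {
    my $path = "$dir/$name";
    next unless -f $path;                      # skip '.', '..' and anything odd
    unlink $path if -M $path > $max_age;       # drop markers older than 8 hours
}
closedir $dh;
```

Scheduled hourly it would look something like `17 * * * * /usr/local/bin/clean-failed-ids.pl` (the script name is made up for the example).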
It's simple, persistent, requires nothing beyond Perl itself, and should easily be fast enough to cater for 6 lookups per second.
Re^2: Fast Recall
by pemungkah (Priest) on Sep 03, 2010 at 04:18 UTC