The problem you may run into is when two or more of these try to run at once. The usual solution to that is based on flock.
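A minimal sketch of the flock approach, in Python for illustration (the lock-file path and function name are my own; `fcntl.flock` is the standard non-blocking exclusive lock):

```python
import fcntl

LOCK_PATH = "/tmp/hit-tracker.lock"  # hypothetical lock-file location

def try_lock(path: str = LOCK_PATH):
    """Try to take an exclusive, non-blocking lock on the given file.

    Returns the open file object (keep it open to hold the lock),
    or None if another process already holds the lock.
    """
    fh = open(path, "w")
    try:
        fcntl.flock(fh, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except OSError:
        # Another instance holds the lock; bail out quietly.
        fh.close()
        return None
    return fh
```

A caller would do `fh = try_lock()` at startup and exit immediately if it gets `None`; the lock is released when `fh` is closed or the process exits.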
Another way to deal with concurrency that I've used in the past is to write each item to a unique file in a particular directory. Then a single cron/batch job comes along later and processes each file, unlinking them as they're finished. The only race condition is if the batch processor tries to work on a file the writer hasn't finished writing yet. I'd avoid that by not working on any file that's less than (say) a minute old.
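The spool-directory scheme above can be sketched like this (again in Python; the directory name, helper names, and one-minute threshold are assumptions matching the description, not anyone's actual code):

```python
import os
import tempfile
import time

SPOOL_DIR = "spool"  # hypothetical spool directory for pending items
os.makedirs(SPOOL_DIR, exist_ok=True)

def write_item(data: str) -> str:
    """Writer side: record one item in its own uniquely named file."""
    fd, path = tempfile.mkstemp(dir=SPOOL_DIR, suffix=".item")
    with os.fdopen(fd, "w") as f:
        f.write(data)
    return path

def process_batch(handle, min_age_seconds: int = 60) -> int:
    """Batch side: process each file, unlinking it when finished.

    Files younger than min_age_seconds are skipped, so a file the
    writer hasn't finished writing yet is left for the next run.
    """
    done = 0
    now = time.time()
    for name in os.listdir(SPOOL_DIR):
        path = os.path.join(SPOOL_DIR, name)
        if now - os.path.getmtime(path) < min_age_seconds:
            continue  # writer may still be filling this one in
        with open(path) as f:
            handle(f.read())
        os.unlink(path)
        done += 1
    return done
```

The age check is the whole concurrency story here: writers never coordinate with each other (unique filenames), and the single batch job only has to avoid half-written files.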
In reply to Re: Hit tracking optimization... by kyle, in thread Hit tracking optimization... by cosmicperl