in reply to mod_perl best practice to consume dynamic file
Depending on the nature of the JSON file, you might be better off with a database: the database then takes care of updates to the data and of concurrent access for you.
Otherwise, you can sidestep most problems by updating the JSON file atomically: write the new version under a temporary name in the same directory, then use the rename system call to move it into place. Since rename is atomic on the same filesystem, readers see either the old file or the new one, never a half-written file.
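A minimal sketch of that atomic-update pattern in Perl; the helper name `write_json_atomic` is my own, not anything from your code:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempfile);
use File::Basename qw(dirname);

# Write new JSON content atomically: readers either see the old file
# or the complete new one, never a partially written version.
sub write_json_atomic {
    my ($path, $json_text) = @_;

    # Create the temporary file in the same directory, so that
    # rename() stays on one filesystem and is therefore atomic.
    my ($fh, $tmp) = tempfile('dataXXXXXX', DIR => dirname($path));
    print {$fh} $json_text or die "write $tmp: $!";
    close $fh              or die "close $tmp: $!";

    # Atomically replace the old file with the new one.
    rename $tmp, $path or die "rename $tmp -> $path: $!";
}
```

If the writer can crash between `close` and `rename`, the only debris is a stray temporary file; the published JSON is never corrupted.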
You should also think a bit about whether it is acceptable for clients to read stale or inconsistent data. If you have a quick client and a slow client, the slow client might end up with older data than the quick client, because the slow client's request read the JSON file before the change and the quick client's request read it after.
If you read the data for every request, you might suffer from a thundering-herd problem when many clients hit your server at the same time. It would be better to read the data once and share it across all the workers instead of re-reading it on every request. Even then, each worker process or worker thread will still read the file once, but eliminating that as well is likely more work than it is worth.
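One common compromise is a per-worker cache keyed on the file's mtime: each worker parses the file once and re-parses it only when the file actually changes. A sketch, assuming the file is replaced atomically as above (the function name `get_data` is mine):

```perl
use strict;
use warnings;
use JSON::PP ();    # core module since Perl 5.14

# Per-worker cache: these lexicals persist across requests inside
# one mod_perl worker, so the JSON is parsed only when it changes.
my ($cached_data, $cached_mtime);

sub get_data {
    my ($path) = @_;
    my @st = stat $path or die "stat $path: $!";
    my $mtime = $st[9];

    # Re-read and re-parse only if the file's mtime has changed.
    if (!defined $cached_mtime or $mtime != $cached_mtime) {
        open my $fh, '<:raw', $path or die "open $path: $!";
        local $/;    # slurp the whole file
        $cached_data  = JSON::PP::decode_json(<$fh>);
        $cached_mtime = $mtime;
    }
    return $cached_data;
}
```

The `stat` per request is cheap compared to parsing; combined with the atomic rename, a worker never parses a half-written file.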