in reply to Re^2: mod_perl best practice to consume dynamic file
in thread mod_perl best practice to consume dynamic file

Your OP does not say how and where the modifications to the JSON come from.  It implies that you may intend to cache the data in mod_perl's persistent memory.  The trouble is that each web worker is a separate process with its own copy of that memory: no worker safely knows what the others are doing, so you could easily end up with inconsistent updates to the data.

Like others have already suggested, I would use a database, even if you decide to store the JSON in it as-is.  Be sure to use a database server (and on-disk storage format, if applicable) that supports transactions, so that you can reliably perform a begin transaction ... select ... update ... commit cycle that is atomic.
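That cycle might look like the following.  The table and column names (json_table, id, payload) are purely illustrative, and the FOR UPDATE row lock assumes a server such as PostgreSQL or MySQL/InnoDB:

```sql
-- Atomic read-modify-write of one JSON chunk (hypothetical schema).
BEGIN;
SELECT payload FROM json_table WHERE id = 1 FOR UPDATE;  -- lock the row
-- ...decode, modify, and re-encode the JSON in your application...
UPDATE json_table SET payload = ? WHERE id = 1;
COMMIT;
```

Because the row is locked until COMMIT, two workers modifying the same chunk are serialized instead of silently overwriting each other.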

If you want to stash the data in server memory to avoid unnecessary traffic and decoding, you could add a server-maintained timestamp column to the table and poll with something along the lines of select * from json_table where last_update_time > 'the last last_update_time you saw'.  (Do this in a transaction too, which commits even though it updates nothing.)  If the query returns a row, you know there is an updated JSON chunk that you need to process.  Database transactions, run at an appropriate isolation level, ensure that multiple workers neither interfere with one another nor read inconsistent data.  When you update the table, set last_update_time with the SQL NOW() function, so that the server, not your code, supplies the time value; then query for it before committing the transaction so you know what value the other workers will see.
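Both halves of that pattern, sketched against the same hypothetical json_table, now with a last_update_time column:

```sql
-- Writer: let the server supply the timestamp via NOW(), then read
-- it back before committing so you know the value others will see.
BEGIN;
UPDATE json_table
   SET payload = ?, last_update_time = NOW()
 WHERE id = 1;
SELECT last_update_time FROM json_table WHERE id = 1;
COMMIT;

-- Reader: poll for anything newer than the last timestamp you saw.
-- Still wrapped in a transaction, which commits despite changing nothing.
BEGIN;
SELECT id, payload, last_update_time
  FROM json_table
 WHERE last_update_time > ?;  -- bind the last last_update_time you saw
COMMIT;
```

A returned row is your cue to re-decode that chunk and remember its new last_update_time for the next poll.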

If I may be candid, I would abandon the JSON-in-a-file notion altogether.  The timing issues would be a royal PITA, and database transactions give you an easy, reliable way to avoid them completely.