I am (probably) going to end up with a script that will be repeatedly hit by a lot of clients, something along the lines of 2-4 times a minute, for upwards of hours at a time. The data the script passes back is likely to be very similar from request to request, with possible minor changes.

I have not tested this yet, but it seems like a bad idea, or at least something that could be improved. My first thought was caching, but I ran into the issue that the data coming back has small changes, and those changes are vitally important and must be sent back. So how could a caching system work around that?

And how can I make sure the clients continue to receive updated information, i.e. that the browser or ISP doesn't decide to put a static cache somewhere and just keep handing back the same results?
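To make the question concrete, here is roughly the kind of thing I have been imagining: a minimal Python sketch (not my actual script; `build_payload()` is just a hypothetical placeholder for the real data) where the client revalidates on every request with an ETag, so unchanged responses only cost a 304, while any change, however small, is sent in full. Would something like this address the browser/ISP caching concern?

```python
# Sketch only: revalidation via ETag plus Cache-Control: no-cache,
# using only the standard library.

import hashlib
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def build_payload() -> bytes:
    # Hypothetical stand-in for the real (mostly-similar) data.
    return json.dumps({"status": "ok", "items": [1, 2, 3]}).encode()


class RevalidatingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = build_payload()
        etag = '"%s"' % hashlib.sha256(body).hexdigest()

        # If the client already holds this exact version, skip the body.
        if self.headers.get("If-None-Match") == etag:
            self.send_response(304)
            self.send_header("ETag", etag)
            self.end_headers()
            return

        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("ETag", etag)
        # no-cache means "store it, but revalidate before reusing it",
        # which should stop browsers or ISP proxies from silently
        # serving a stale copy.
        self.send_header("Cache-Control", "no-cache")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("localhost", 8000), RevalidatingHandler).serve_forever()
```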