I'm in the design stage for a new application and I wanted to bounce something off the Perl Monks.
I need multiple people and applications to access the same data set, and I need to manage data writes effectively so that one user doesn't write over someone else's changes.
The core of the application would be a Perl web service. I'm assuming that I'll write it with mod_perl because that's what I've done in the past and it worked really well.
But with this application, some clients, such as an Android device, will have a local copy of a subset of the data. So in theory multiple users could each have a cached copy of the data, both could save changes at roughly the same time, and the last one to write would win, overwriting the other's changes.
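For context, the usual answer to last-write-wins is optimistic locking: each record carries a version number, clients send back the version they fetched, and the server rejects a save whose version is stale so the client can re-fetch and merge. Here is a minimal sketch in plain Perl; the `%store` hash, record fields, and `save_record` function are all stand-ins I made up for whatever storage layer you end up using:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical in-memory store: each record carries a version number.
my %store = (
    note1 => { version => 1, body => 'original text' },
);

# Attempt a save; succeeds only if the client still holds the current
# version. Returns 1 on success, 0 if another client saved first.
sub save_record {
    my ($id, $expected_version, $new_body) = @_;
    my $rec = $store{$id} or return 0;
    return 0 if $rec->{version} != $expected_version;  # stale copy: reject
    $rec->{body} = $new_body;
    $rec->{version}++;                                 # bump for next writer
    return 1;
}

# Two clients both fetched version 1. The first save wins; the second
# is rejected instead of silently clobbering the first one's change.
print save_record('note1', 1, 'first edit'),  "\n";   # prints 1
print save_record('note1', 1, 'second edit'), "\n";   # prints 0
```

In a real web service the version check and update would be one atomic operation (e.g. an `UPDATE ... WHERE id = ? AND version = ?` via DBI, checking the affected-row count), and the rejected client would get a conflict response telling it to re-sync before retrying.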
I'm not going to have hundreds of users accessing the same data, more like 10 at most, but I don't want to confuse the user. I want some of the clients to have their own local copies of the data so that if they go offline they can still use the application, and when they come back online they can resync their changes with other users' changes.
My question is: is there a Perl module, or something Perl can do, that can help me deal with this data race/synchronization problem? And if not a Perl solution, perhaps someone could point me to a document online with a design pattern that will work for me.
It's not that I don't have some idea of how to solve this, but I'm never disappointed with the answers I get from Perl Monks.
In reply to How to deal with data race issues with Perl? by halfbaked