tphyahoo has asked for the wisdom of the Perl Monks concerning the following question:
But what happens if I run multiple crawlers, all writing their data to the same file? Assuming there are no data conflicts -- such as two crawlers updating the same hash key with different data -- am I going to be okay?
If there is a data conflict, does MLDBM give you feedback that there was a conflict, or does it just apply the first value and then the second, or what?
Data-structure-wise, I don't think I need an RDBMS. But concurrency-wise?
If it seems MLDBM is going to give me trouble, can someone suggest an alternative that handles concurrency with hash serialization better?
UPDATE: Followup-ish post at Can I serialize an object and have it remember it's type?
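For what it's worth, MLDBM itself does no locking and no conflict detection: each write serializes the whole value and hands it to the underlying DBM, so the last writer to a key simply wins, and two unsynchronized writers can corrupt the file even when they touch different keys. A common workaround is to flock a sidecar lock file around the whole tie/untie lifetime. Below is a minimal sketch, assuming DB_File and Storable as the backends; the filenames crawl.db and crawl.db.lock are illustrative, not anything MLDBM requires.

    use strict;
    use warnings;
    use MLDBM qw(DB_File Storable);   # DB_File store, Storable serializer
    use Fcntl qw(:DEFAULT :flock);

    # Serialize writers with an exclusive lock on a sidecar file.
    # Locking a separate file (not the DBM file itself) avoids
    # fighting with DB_File's own file handles.
    open my $lock, '>', 'crawl.db.lock' or die "open lock: $!";
    flock $lock, LOCK_EX             or die "flock: $!";

    # Tie only while holding the lock, so DB_File's internal
    # caching can't hide another process's writes.
    tie my %db, 'MLDBM', 'crawl.db', O_CREAT | O_RDWR, 0640
        or die "tie: $!";

    # Nested structures must be written back whole; MLDBM cannot
    # see in-place changes below the top level.
    my $rec = $db{'http://example.com/'} || {};
    $rec->{status}  = 'fetched';
    $rec->{fetched} = time;
    $db{'http://example.com/'} = $rec;

    untie %db;                       # flush before releasing the lock
    close $lock;                     # closing the handle drops the flock

MLDBM::Sync wraps essentially this pattern (flock around each access) behind the plain tie interface, so it is probably the easier drop-in if you want to stay with DBM files rather than move to an RDBMS.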
Re: How's mldbm with concurrency?
by jZed (Prior) on Mar 15, 2005 at 16:57 UTC

Re: How's mldbm with concurrency?
by perrin (Chancellor) on Mar 15, 2005 at 18:10 UTC

Re: How's mldbm with concurrency?
by merlyn (Sage) on Mar 15, 2005 at 20:23 UTC