dl748 has asked for the wisdom of the Perl Monks concerning the following question:

I have a database I'm working on. I've looked for an example on the internet but haven't found much. I'm looking for a way to read and write a database with multiple processes accessing it. I've used DB_File and then flocked and unflocked the file, and that works fine. The problem I'm having is that when another process tries to read the same record that I've written in the other, I do not get the same data. If it was a new record, it works OK, but if I've modified a record, the other process doesn't see the change until I untie and retie the DBM. I am using sync to flush the cache after I write, too. I think the file is caching the data. Is there any way to flush the READ cache? Or maybe another type of database I can use that is not server based?
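(A sketch of the pattern described above, with hypothetical filenames — tie first, then flock the DBM's own file descriptor; the comment marks where the stale-cache problem bites:)

```perl
use DB_File;
use Fcntl qw(:flock O_RDWR O_CREAT);

# Tie first, then flock the underlying fd -- the pattern described above.
tie my %db, 'DB_File', 'data.db', O_RDWR|O_CREAT, 0644, $DB_HASH
    or die "tie: $!";
my $fd = (tied %db)->fd;                  # file descriptor of the DBM file
open my $fh, '<&=', $fd or die "dup: $!";
flock($fh, LOCK_EX) or die "flock: $!";

$db{key} = 'value';
(tied %db)->sync;                         # flushes writes, but not reads:
                                          # pages another process already
                                          # cached stay stale until it re-ties

flock($fh, LOCK_UN);
close $fh;
untie %db;
```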

Replies are listed 'Best First'.
(jeffa) Re: database problems
by jeffa (Bishop) on Jun 02, 2001 at 02:49 UTC
    You didn't mention that you _HAVE_ to use DB_File's, so . . .

    Sounds like you need a relational database. DB_File's are fine and dandy, but as soon as you start adding 'users', you really should 'upgrade'.

    As well as being able to handle multiple users reading and writing the data at any time, you also get the added benefit of normalized data. You can join tables together and use SQL to selectively retrieve and sort the data.

    I recommend looking into MySQL and Postgres.

    If you aren't convinced, then please read the words of my friend eduardo: Re: DBI vs MLDBM/GDBM_File.

    Jeff

    R-R-R--R-R-R--R-R-R--R-R-R--R-R-R--
    L-L--L-L--L-L--L-L--L-L--L-L--L-L--
    
      I understand your reasoning, but aren't MySQL and Postgres "servers" that must be installed on my server? All I need to store is a key => value pair; I don't need a complex database engine. All I was hoping for is one that locks and unlocks but does not cache. Maybe an example. And no, I didn't say I had to use DB_File. I know Unix and Linux have hundreds of files that multiple users read and write, and they don't have this problem, nor do they have to install some server system to do their work for them. I'm thinking there are functions I can use as well.
Re: database problems
by bikeNomad (Priest) on Jun 02, 2001 at 02:52 UTC
    There is a good discussion of locking and DB_File here. You may want to use the built-in locking in the newer BerkeleyDB module instead, as it will scale better for multiple users.
      Yeah, but that has to be compiled on the system. So if I were to install this on someone else's system, they would have to install and compile the BerkeleyDB module and the entire Berkeley DB library.
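      (For reference, the built-in locking bikeNomad mentions looks roughly like this with the BerkeleyDB module's Concurrent Data Store mode — a sketch only; the filenames are made up, and as noted above the module does require the Berkeley DB library to be installed:)

```perl
use BerkeleyDB;
use File::Temp qw(tempdir);

# A shared environment provides the locking; DB_INIT_CDB gives
# one-writer/many-readers locking without a full transaction system.
my $home = tempdir(CLEANUP => 1);    # hypothetical environment directory
my $env  = BerkeleyDB::Env->new(
    -Home  => $home,
    -Flags => DB_CREATE | DB_INIT_CDB | DB_INIT_MPOOL,
) or die "env: $BerkeleyDB::Error";

my $db = BerkeleyDB::Hash->new(
    -Filename => 'data.db',
    -Flags    => DB_CREATE,
    -Env      => $env,
) or die "db: $BerkeleyDB::Error";

$db->db_put('key', 'value');         # locked internally by the CDS
$db->db_get('key', my $val);
```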
Re: database problems
by fs (Monk) on Jun 02, 2001 at 07:30 UTC
    Sounds like you didn't read the DB_File documentation all the way through - the Hints and Tips section warns of exactly this problem, and explains why it happens and how to avoid it.
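    (Roughly, the fix that section describes: take the lock on a separate sentinel file *before* tie-ing, and untie before releasing it, so no stale pages survive in the cache. A sketch, with hypothetical filenames:)

```perl
use DB_File;
use Fcntl qw(:flock O_RDWR O_CREAT);

# Lock a separate sentinel file BEFORE tie-ing: nothing gets read into
# the in-memory cache until we hold the lock.
open my $lock, '>', 'records.db.lock' or die "lock: $!";
flock($lock, LOCK_EX) or die "flock: $!";

tie my %db, 'DB_File', 'records.db', O_RDWR|O_CREAT, 0644, $DB_HASH
    or die "tie: $!";
$db{key} = 'new value';
untie %db;               # flush writes and throw the cache away

flock($lock, LOCK_UN);   # only now may another process tie the file
close $lock;
```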
Re: database problems
by Zaxo (Archbishop) on Jun 02, 2001 at 04:47 UTC
    If the second process is jumping in as soon as you clear the lock, it might be enough to call
    system '/bin/sync'; # or maybe system `which sync`;
    before you unlock.

    After Compline
    Zaxo

      That will not work: both processes have the same file open at the same time. I do a write and then sync, but the other process has cached part of the file in memory, so it never sees the change. Running the sync command just flushes all the writes waiting on the system; it doesn't affect what other programs have already read.
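      (In other words, sync can't help the reader; the reader has to re-tie under the lock on every access so its cache is always fresh. A hypothetical wrapper — names and filenames made up — that does this for both reads and writes:)

```perl
use DB_File;
use Fcntl qw(:flock O_RDWR O_CREAT);

# Hypothetical helper: take the lock, tie fresh, run the code, untie,
# release.  Re-tieing on every access is what discards the stale cache.
sub with_db {
    my ($lock_mode, $code) = @_;
    open my $lock, '>', 'counter.db.lock' or die "lock: $!";
    flock($lock, $lock_mode) or die "flock: $!";
    tie my %db, 'DB_File', 'counter.db', O_RDWR|O_CREAT, 0644, $DB_HASH
        or die "tie: $!";
    my $result = $code->(\%db);
    untie %db;
    flock($lock, LOCK_UN);
    close $lock;
    return $result;
}

with_db(LOCK_EX, sub { $_[0]{counter} = 42 });        # writer: exclusive lock
my $n = with_db(LOCK_SH, sub { $_[0]{counter} });     # reader: shared lock
```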