Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hello Monks. I have a flat CSV database of 50,000 codes. Once a user has requested a code, it needs to be tagged as used (or removed) so that other users won't be served the same code. I can read the pool of codes and filter out the one served to that user, leaving an array of the remaining codes:
open (DB, "<".$codes_db) || die ("Can't open $codes_db"); while (<DB>) { &cleanData($_); # this removes white space push(@buffer, $_) unless ($_ eq $code); } close (DB) || die ("Can't close $codes_db");
This is where I am stuck. What is the best way to write the @buffer array of unused codes back to the code pool, or to mark the served code as used? Thanks.

2006-11-15 Retitled by planetscape, as per Monastery guidelines
Original title: 'Help with codes'

Re: Changing records in a text file
by friedo (Prior) on Nov 15, 2006 at 16:47 UTC

    You want to do atomic updates on a large pool of data -- this sounds like a problem for a database. Something like SQLite, or even a disk-based hash like BerkeleyDB, will be far superior to manually munging individual fields in a giant CSV.

    If you use an SQL-based solution like MySQL or SQLite, then it's as simple as executing "UPDATE codes SET used = 1 WHERE id = 'foobar'" and you let the DB do the hard work for you.
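
    Something like this, assuming a table created as codes(code TEXT PRIMARY KEY, used INTEGER DEFAULT 0) -- the file, table, and column names here are just for illustration:

        use DBI;

        # Connect to the SQLite file (created on first use). RaiseError
        # makes DBI die on errors instead of failing silently.
        my $dbh = DBI->connect("dbi:SQLite:dbname=codes.db", "", "",
                               { RaiseError => 1, AutoCommit => 1 });

        # Mark the served code as used; the database handles the
        # locking and atomicity for you.
        $dbh->do("UPDATE codes SET used = 1 WHERE code = ?", undef, $code);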

Re: Changing records in a text file
by davorg (Chancellor) on Nov 15, 2006 at 16:45 UTC

    The old and boring way to do it would be to open the file in read/write mode, read the contents into an array, make the changes to the array, rewind the file pointer, truncate the file handle and then write out the array. Oh, and don't forget to flock the file handle to stop anyone else trying to access the file.
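
    In outline, something like this (a sketch only -- untested, with error checking kept to a minimum):

        use Fcntl qw(:flock SEEK_SET);

        open my $fh, '+<', $codes_db or die "Can't open $codes_db: $!";
        flock $fh, LOCK_EX           or die "Can't lock $codes_db: $!";

        chomp(my @codes = <$fh>);               # read the contents
        @codes = grep { $_ ne $code } @codes;   # make the changes

        seek $fh, 0, SEEK_SET or die "Can't rewind: $!";    # rewind
        truncate $fh, 0       or die "Can't truncate: $!";  # truncate
        print $fh "$_\n" for @codes;                        # write out

        close $fh or die "Can't close $codes_db: $!";  # releases the lock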

    But these days, it's much easier to just use Tie::File.
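
    For example (again a sketch; note that tying the file does not lock it, so you still need to handle locking yourself for concurrent access):

        use Tie::File;
        use Fcntl 'O_RDWR';

        # Tie the file to an array; changes to the array go straight to
        # the file, and the file is not slurped into memory all at once.
        tie my @codes, 'Tie::File', $codes_db, mode => O_RDWR
            or die "Can't tie $codes_db: $!";

        # Remove the code that was served.
        for my $i (0 .. $#codes) {
            if ($codes[$i] eq $code) {
                splice @codes, $i, 1;
                last;
            }
        }

        untie @codes;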

    --
    <http://dave.org.uk>

    "The first rule of Perl club is you do not talk about Perl club."
    -- Chip Salzenberg

      Thanks, I did try to write them back, but the file is too large and every so often it just deletes a large chunk of codes.
      open (DB, ">".$codes_db) || die ("Can't open $codes_db"); flock (DB, 2) || die ("Can't lock $codes_db"); foreach $line(@buffer){ print DB "$line\n"; } flock (DB, 8) || die ("Can't close $codes_db"); close (DB) || die ("Can't close $codes_db");
      Is there anything I can do while the file is open in the while loop?
Re: Changing records in a text file
by madbombX (Hermit) on Nov 15, 2006 at 17:12 UTC
    Have you considered using Text::CSV_XS for manipulating and writing out the CSV? Text::CSV_XS comes with a method for writing your resultant CSV database back out to a file (or filehandle).
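
    A sketch of that approach (assuming one code per record, in the first column):

        use Text::CSV_XS;

        my $csv = Text::CSV_XS->new({ binary => 1, eol => "\n" });

        # Read every row, keeping all but the code that was served.
        open my $in, '<', $codes_db or die "Can't open $codes_db: $!";
        my @rows;
        while (my $row = $csv->getline($in)) {
            push @rows, $row unless $row->[0] eq $code;
        }
        close $in;

        # Write the remaining rows back out.
        open my $out, '>', $codes_db or die "Can't write $codes_db: $!";
        $csv->print($out, $_) for @rows;
        close $out or die "Can't close $codes_db: $!";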

    You can also consider treating it like an SQL db with DBD::CSV, depending on your familiarity with SQL.
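
    For instance (the table mapping and the 'code' column name are assumptions; adjust to match your file):

        use DBI;

        # Treat the flat file as an SQL table via DBD::CSV.
        my $dbh = DBI->connect("dbi:CSV:", undef, undef, { RaiseError => 1 });
        $dbh->{csv_tables}{codes} = {
            f_file    => $codes_db,   # older DBD::CSV versions use 'file'
            col_names => ['code'],
        };

        # Remove the served code with plain SQL.
        $dbh->do("DELETE FROM codes WHERE code = ?", undef, $code);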

Re: Changing records in a text file
by SheridanCat (Pilgrim) on Nov 15, 2006 at 17:14 UTC
    Doing this with a text file is going to be dangerous no matter what method you use, I think. There will be concurrency issues unless you're very careful and have complete control of the environment, and even if you have those guarantees today, that doesn't mean you'll still have them at some later date.

    A proper database using transactions is really the right way to handle this, I think.
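
    For example, with DBI the "claim a code" step can be made atomic, so two users can never be handed the same code (a sketch; the codes(code, used) schema and file name are assumptions):

        use DBI;

        my $dbh = DBI->connect("dbi:SQLite:dbname=codes.db", "", "",
                               { RaiseError => 1, AutoCommit => 1 });

        # Pick a candidate, then flip its flag only if it is still
        # unused. The "AND used = 0" makes the update atomic: only one
        # session can change used from 0 to 1, so a loser just retries.
        my $claimed;
        for (1 .. 5) {
            my ($candidate) = $dbh->selectrow_array(
                "SELECT code FROM codes WHERE used = 0 LIMIT 1");
            last unless defined $candidate;   # pool exhausted
            my $rows = $dbh->do(
                "UPDATE codes SET used = 1 WHERE code = ? AND used = 0",
                undef, $candidate);
            if ($rows == 1) { $claimed = $candidate; last }
        }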

      DBD::CSV eases that transition and saves us from reinventing methods for updating CSV files.