Ntav has asked for the wisdom of the Perl Monks concerning the following question:

I have what I guess must be quite a common situation.
I need to frequently store data to a structure dumped with Data::Dumper (it must be dumped since it is fed to C++), but I don't want to go through the process of eval-ing it, updating the structure, and redumping it every time, since the structure is fairly massive.
At the moment I log the data to another file until I have N records, then eval the structure and apply those updates together.
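Roughly, that batching scheme looks like the following sketch (the file names, record format, and batch size are simplified placeholders):

    use strict;
    use warnings;
    use Data::Dumper;

    # Cheap per-record append; called for every new record.
    sub log_record {
        my ($key, $value) = @_;
        open my $log, '>>', 'pending.log' or die "append: $!";
        print $log "$key\t$value\n";
        close $log;
    }

    # Expensive merge, run only once per N records: eval the dump,
    # apply the pending updates, and redump the whole structure.
    sub merge_pending {
        my $data = do './structure.dump';
        $data = {} unless ref $data;          # first run: no dump yet
        open my $in, '<', 'pending.log' or die "read: $!";
        while (my $line = <$in>) {
            chomp $line;
            my ($k, $v) = split /\t/, $line;
            $data->{$k} = $v;
        }
        close $in;

        open my $out, '>', 'structure.dump' or die "write: $!";
        print $out Data::Dumper->Dump([$data], ['data']);
        close $out;
        unlink 'pending.log';
    }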
An option I have experimented with is updating the dumped structure as text (i.e., without eval-ing it), but I'm aware that this is a somewhat shady programming practice.
I would like the monks' advice as to whether there is a good way to update such a structure incrementally and efficiently. Thanks for any help,
Ntav.
humble initiate

(Ovid) Re: Frequently updating large datastructures
by Ovid (Cardinal) on Sep 01, 2001 at 19:48 UTC

    What you are describing is a database. If you have access to one, you should probably use that. If you can only work with flat files, I think DBD::CSV might be a reasonable option. Then, you'd have to figure out how to handle it in C++, but that's another story. I assume that C++ has CSV libraries, right?
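
    A minimal sketch of the DBD::CSV route (the table name, columns, and directory here are just assumptions):

        use strict;
        use warnings;
        use DBI;

        # DBD::CSV keeps the table as a plain CSV file in f_dir, so the
        # C++ side can read it with any CSV parser.
        my $dbh = DBI->connect('dbi:CSV:', undef, undef,
            { f_dir => '.', RaiseError => 1 });

        $dbh->do('CREATE TABLE records (id INTEGER, value CHAR(64))')
            unless -e 'records';

        # Each new record is a cheap INSERT; there is no need to reload
        # the whole structure to add one row.
        my $sth = $dbh->prepare('INSERT INTO records VALUES (?, ?)');
        $sth->execute(42, 'some data');

        $dbh->disconnect;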

    Cheers,
    Ovid


Re (tilly) 1: Frequently updating large datastructures
by tilly (Archbishop) on Sep 02, 2001 at 18:37 UTC
    As Ovid said, this sounds like a database.

    But one option that might be a good fit is Berkeley DB. It has both Perl and C++ interfaces (though you will have to pay a licensing fee to use it in a non-open-source C++ program), and it lets you access a very large (though simple) data structure from both languages simultaneously and update it from either.