in reply to Storing large data structures on disk

The common technique for large datasets is to use a database.
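For instance, something along these lines with DBI and DBD::SQLite (an untested sketch; the table layout and file name are just one possibility):

    use strict;
    use warnings;
    use DBI;

    # One table cell per (row, column) pair, so rows may have
    # different lengths.
    my $dbh = DBI->connect('dbi:SQLite:dbname=rows.db', '', '',
                           { RaiseError => 1 });
    $dbh->do('CREATE TABLE IF NOT EXISTS cells
              (row INTEGER, col INTEGER, value INTEGER)');

    my @aoa = ([1, 2, 3], [4, 5]);
    my $sth = $dbh->prepare(
        'INSERT INTO cells (row, col, value) VALUES (?, ?, ?)');
    for my $r (0 .. $#aoa) {
        $sth->execute($r, $_, $aoa[$r][$_]) for 0 .. $#{ $aoa[$r] };
    }
    $dbh->disconnect;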

In the case you describe, a simple CSV file would seem to do it too, and will probably use less memory in the process. Or maybe a compact, home-made binary format, to avoid excessive disk usage. pack and unpack for the win.
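For illustration, a minimal sketch of the pack/unpack round trip (the 'l*' template, i.e. native signed 32-bit integers, is just one possible choice):

    use strict;
    use warnings;

    # Pack a row of integers into fixed 4-byte binary fields,
    # then unpack it again.
    my @row    = (3, 14, 159, 2653);
    my $packed = pack 'l*', @row;      # 16 bytes of binary data
    my @again  = unpack 'l*', $packed;
    print "@again\n";                  # prints: 3 14 159 2653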

Perl 6 - links to (nearly) everything that is Perl 6.

Re^2: Storing large data structures on disk
by roibrodo (Sexton) on May 31, 2010 at 15:36 UTC
    Thanks for the reply.

    I followed your hint and read perlpacktut. Since my array is 2d and each row has a different number of values (columns), doesn't that mean it will require a lot of programming overhead to define the templates?

    Also, I would prefer a generic method that will work for different data structures (e.g. hashes of ...).

    I will continue reading up on pack and unpack, and will appreciate any pointers to examples of their use for the purposes I described.
      Since my array is 2d and each row has a different number of values (columns), doesn't that mean it will require a lot of programming overhead to define the templates?

      It can be as simple as representing each row by an integer that holds the number of values in that row, followed by the row itself in fixed-width fields (for example, always 4 bytes per number).
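      A rough, untested sketch of that layout, assuming native signed 32-bit values ('l') and a made-up file name; for files shared between machines you would want a fixed-endian template such as 'l<' instead:

          use strict;
          use warnings;

          my @aoa = ([1, 2, 3], [4, 5], [6, 7, 8, 9]);

          # Write: for each row, one 32-bit count followed by that many
          # 32-bit values.
          open my $out, '>:raw', 'rows.bin' or die "open: $!";
          print {$out} pack('l l*', scalar @$_, @$_) for @aoa;
          close $out;

          # Read it back row by row: first the count, then the values.
          open my $in, '<:raw', 'rows.bin' or die "open: $!";
          my @back;
          while (read($in, my $buf, 4) == 4) {
              my $count = unpack 'l', $buf;
              read($in, $buf, 4 * $count) == 4 * $count or die "truncated row";
              push @back, [ unpack 'l*', $buf ];
          }
          close $in;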

      But that all depends on how flexible the data retrieval needs to be, how robust it must be, etc.