in reply to Efficient memory

Try Storable - it stores data in a binary format which is more compact than Data::Dumper.

Having said that, are you sure that the amount of data created by the serialisation process is what is actually causing the problem? There are a number of ways to check, such as profiling.

In short, have you definitely isolated the code causing the bottlenecks, or are you working on an assumption?
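For the record, a minimal Storable round trip looks like this (the file name and sample data are made up for illustration):

```perl
use strict;
use warnings;
use Storable qw(nstore retrieve);

# Hypothetical data structure: a hash of array refs.
my %data = (
    apple  => [ 'red', 'green' ],
    banana => [ 'yellow' ],
);

# nstore writes in network (portable) byte order; plain store is
# slightly faster but machine-dependent.
nstore( \%data, 'data.sto' );

# retrieve returns a reference to the stored structure.
my $restored = retrieve('data.sto');
print "banana is $restored->{banana}[0]\n";

unlink 'data.sto';    # clean up the temporary file
```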

rdfield

Re: Re: Efficient memory
by PetaMem (Priest) on Nov 05, 2002 at 10:00 UTC
    Hi,

    actually, don't overestimate the power of Storable. Here are some benchmarks we did when we tried to harness Storable's power and failed(?):

    We have a large hash where each key maps to a list of sets of strings. Our OLD method of storing it was:

    extracting these strings (iterating over the whole data structure), concatenating them in a sensible way (with unique delimiters), and writing them to disk.

    And we tried a new way: storing them via Storable.

    Storable was about 7 times slower than the old method when saving the data, about 5 times faster when retrieving the data and took about 10 times more space on disk.

    For our special purposes, we decided to stick with the old method, but you may find it worth trying Storable: it is way easier and more consistent. On the other hand, even though it saves everything in binary form, it still needs a lot of disk space, because it saves the complete data structure too.

    You can save that space by rebuilding the data structure on the fly while reading the data back in.
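    A minimal sketch of such a delimiter-based flat format (the format, delimiters, and function names are hypothetical, not PetaMem's actual code): one key per line, sets separated by '|', strings within a set separated by ';'; the hash is rebuilt one line at a time on read.

```perl
use strict;
use warnings;

# Write the hash-of-list-of-sets to a flat file. The delimiters
# must not occur in the data itself.
sub save_flat {
    my ( $file, $hash ) = @_;
    open my $fh, '>', $file or die "open $file: $!";
    for my $key ( sort keys %$hash ) {
        my $line = join '|', map { join ';', @$_ } @{ $hash->{$key} };
        print {$fh} "$key\t$line\n";
    }
    close $fh;
}

# Rebuild the data structure on the fly while reading, so only the
# raw strings (plus delimiters) ever hit the disk.
sub load_flat {
    my ($file) = @_;
    my %hash;
    open my $fh, '<', $file or die "open $file: $!";
    while ( my $line = <$fh> ) {
        chomp $line;
        my ( $key, $rest ) = split /\t/, $line, 2;
        $hash{$key} = [ map { [ split /;/, $_ ] } split /\|/, $rest ];
    }
    close $fh;
    return \%hash;
}

save_flat( 'flat.txt', { fruit => [ [ 'apple', 'pear' ], ['plum'] ] } );
my $loaded = load_flat('flat.txt');
unlink 'flat.txt';
```

    The trade-off is exactly the one described above: no per-node structural overhead on disk, at the cost of choosing delimiters that can never appear in the data.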

    Bye
     PetaMem

      The relevant comparison here is Storable vs. Data::Dumper. Storable is consistently faster than Data::Dumper.

      Yes, but our large hash was actually an object, so Storable stored the whole object, whereas the OLD method stores only the data...
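      The point about objects can be seen directly: freezing a blessed reference records the class name along with the data, so the frozen object is never smaller than the frozen plain structure. A small sketch (the class name and data here are invented for illustration):

```perl
use strict;
use warnings;
use Storable qw(freeze);

# The same data, once as a plain hash and once blessed into a class.
my %plain = ( key => [ [ 'a', 'b' ] ] );
my $obj   = bless { %plain }, 'My::Lexicon';

# freeze serialises the complete structure; for the blessed ref the
# class name is recorded too, so its serialisation is longer.
my $len_plain = length freeze( \%plain );
my $len_obj   = length freeze($obj);
print "plain: $len_plain bytes, object: $len_obj bytes\n";
```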