Storable is probably not optimized for low memory consumption. Also, since you freeze the structure and then run bzip on the result (did you use a pipe for that, i.e. the command line utility bzip2?), you probably have the unfrozen, the frozen, and the compressed data in memory at the same time.
A more memory conserving algorithm would, for example, freeze one column, push that into the compression pipe, and clear that column before moving on to the next one. A sketch of this follows.
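A minimal sketch of that idea, assuming the data is an array of column array references and that you pipe straight into the command line bzip2 (the file name columns.storable.bz2 is made up for illustration):

    use strict;
    use warnings;
    use Storable qw(nfreeze);

    # Hypothetical example data: three columns of numbers.
    my @columns = ( [ 1 .. 5 ], [ 6 .. 10 ], [ 11 .. 15 ] );

    # Pipe into the command line bzip2, so the compressed data
    # never sits in this process's memory.
    open my $pipe, '|-', 'bzip2 -c > columns.storable.bz2'
        or die "cannot start bzip2: $!";
    binmode $pipe;

    for my $col (@columns) {
        my $frozen = nfreeze($col);       # serialize one column only
        # Length prefix, so the columns can be thawed back one by one.
        print $pipe pack( 'N', length $frozen ), $frozen;
        $col = undef;                     # release this column's memory
    }
    close $pipe or die "bzip2 failed: $!";

Reading it back is the mirror image: read four length bytes, then that many bytes, and thaw each column in turn.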
Whether MLDBM is a good option depends very much on the data and what you need to do with it. For example, if you need random access to some but not all of the array elements (or some columns but not all), a database like MLDBM would be an excellent solution. If instead, as you seem to indicate, your processing always needs all of the data (for a Fourier transform or a matrix multiplication or ...), a database is a waste of time. If on the other hand you are searching for patterns, it might be possible to do some preprocessing and store the data as a hash with all possible subpatterns as keys and the locations as values, as sketched below. You see, there are many possibilities.
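A minimal sketch of that subpattern index, assuming the rows are strings and the subpattern length is fixed; the data and names here are made up for illustration:

    use strict;
    use warnings;

    my @rows = ( "ACGTAC", "GTACGT" );   # hypothetical example data
    my $k    = 3;                        # fixed subpattern length

    my %index;   # subpattern => list of [row, offset] locations
    for my $r ( 0 .. $#rows ) {
        for my $pos ( 0 .. length( $rows[$r] ) - $k ) {
            push @{ $index{ substr( $rows[$r], $pos, $k ) } }, [ $r, $pos ];
        }
    }

    # A lookup is now a single hash access instead of a full scan:
    for my $hit ( @{ $index{"TAC"} || [] } ) {
        print "TAC found in row $hit->[0] at offset $hit->[1]\n";
    }

A hash like this could then itself be stored with MLDBM, giving you random access by subpattern without loading everything.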
PS: You are aware that a few million rows of, on average, 100 numbers each already use 2G of memory? (2.5 million rows * 100 numbers * 8 bytes is 2G for the raw doubles alone.) Perl stores a lot of internal information with every variable, so the real footprint is considerably larger.
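If you want to measure this on your machine, here is a quick check, assuming the CPAN module Devel::Size is installed:

    use strict;
    use warnings;
    use Devel::Size qw(total_size);

    # 10_000 rows of 100 numbers = 1_000_000 scalars in total.
    my @rows = map { [ map { rand } 1 .. 100 ] } 1 .. 10_000;

    printf "%.1f MB total, about %.0f bytes per number\n",
        total_size( \@rows ) / 2**20,
        total_size( \@rows ) / 1_000_000;

Multiply the per-number figure by your real row and column counts to estimate the full structure.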
In reply to Re^3: Storing large data structures on disk by jethro, in thread Storing large data structures on disk by roibrodo